Mar 10 15:05:34 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 15:05:34 crc restorecon[4683]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:05:34 crc restorecon[4683]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc 
restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc 
restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 
15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc 
restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc 
restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34
crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 
15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc 
restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:05:34 crc restorecon[4683]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 
crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc 
restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:34 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc 
restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:05:35 crc restorecon[4683]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 10 15:05:35 crc kubenswrapper[4743]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:05:35 crc kubenswrapper[4743]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 10 15:05:35 crc kubenswrapper[4743]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:05:35 crc kubenswrapper[4743]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 15:05:35 crc kubenswrapper[4743]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 10 15:05:35 crc kubenswrapper[4743]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.708954 4743 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713892 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713906 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713911 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713915 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713919 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713922 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713927 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713930 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713934 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713938 4743 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713942 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713945 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713949 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713952 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713955 4743 feature_gate.go:330] unrecognized feature gate: Example Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713959 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713967 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713971 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713976 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713981 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713984 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713988 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713991 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713995 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.713998 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714002 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714006 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714010 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714015 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714018 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714022 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714026 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714030 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714033 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714037 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714041 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714045 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714050 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714054 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714058 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714063 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714068 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 
15:05:35.714072 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714076 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714080 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714086 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714090 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714094 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714107 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714112 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714117 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714122 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714126 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714130 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714134 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714138 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714142 4743 feature_gate.go:330] 
unrecognized feature gate: AdminNetworkPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714145 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714149 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714152 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714156 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714160 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714163 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714167 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714170 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714176 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714181 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714185 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714189 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714192 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.714195 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714267 4743 flags.go:64] FLAG: --address="0.0.0.0" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714276 4743 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714283 4743 flags.go:64] FLAG: --anonymous-auth="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714289 4743 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714295 4743 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714299 4743 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714305 4743 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714310 4743 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714314 4743 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714318 4743 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 10 15:05:35 crc kubenswrapper[4743]: 
I0310 15:05:35.714323 4743 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714327 4743 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714331 4743 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714335 4743 flags.go:64] FLAG: --cgroup-root="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714339 4743 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714343 4743 flags.go:64] FLAG: --client-ca-file="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714347 4743 flags.go:64] FLAG: --cloud-config="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714351 4743 flags.go:64] FLAG: --cloud-provider="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714355 4743 flags.go:64] FLAG: --cluster-dns="[]" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714360 4743 flags.go:64] FLAG: --cluster-domain="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714364 4743 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714368 4743 flags.go:64] FLAG: --config-dir="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714372 4743 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714376 4743 flags.go:64] FLAG: --container-log-max-files="5" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714382 4743 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714386 4743 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714390 4743 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 10 15:05:35 crc 
kubenswrapper[4743]: I0310 15:05:35.714395 4743 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714399 4743 flags.go:64] FLAG: --contention-profiling="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714403 4743 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714407 4743 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714411 4743 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714415 4743 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714421 4743 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714425 4743 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714429 4743 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714433 4743 flags.go:64] FLAG: --enable-load-reader="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714437 4743 flags.go:64] FLAG: --enable-server="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714441 4743 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714446 4743 flags.go:64] FLAG: --event-burst="100" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714450 4743 flags.go:64] FLAG: --event-qps="50" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714454 4743 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714459 4743 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714463 4743 flags.go:64] FLAG: --eviction-hard="" Mar 10 15:05:35 crc 
kubenswrapper[4743]: I0310 15:05:35.714468 4743 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714471 4743 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714475 4743 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714479 4743 flags.go:64] FLAG: --eviction-soft="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714484 4743 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714487 4743 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714491 4743 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714495 4743 flags.go:64] FLAG: --experimental-mounter-path="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714499 4743 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714503 4743 flags.go:64] FLAG: --fail-swap-on="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714507 4743 flags.go:64] FLAG: --feature-gates="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714514 4743 flags.go:64] FLAG: --file-check-frequency="20s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714518 4743 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714522 4743 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714527 4743 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714532 4743 flags.go:64] FLAG: --healthz-port="10248" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714536 4743 flags.go:64] FLAG: --help="false" Mar 10 15:05:35 crc 
kubenswrapper[4743]: I0310 15:05:35.714540 4743 flags.go:64] FLAG: --hostname-override="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714544 4743 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714548 4743 flags.go:64] FLAG: --http-check-frequency="20s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714552 4743 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714556 4743 flags.go:64] FLAG: --image-credential-provider-config="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714560 4743 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714564 4743 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714568 4743 flags.go:64] FLAG: --image-service-endpoint="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714571 4743 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714575 4743 flags.go:64] FLAG: --kube-api-burst="100" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714579 4743 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714583 4743 flags.go:64] FLAG: --kube-api-qps="50" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714587 4743 flags.go:64] FLAG: --kube-reserved="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714591 4743 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714595 4743 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714599 4743 flags.go:64] FLAG: --kubelet-cgroups="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714603 4743 flags.go:64] FLAG: --local-storage-capacity-isolation="true" 
Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714607 4743 flags.go:64] FLAG: --lock-file="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714611 4743 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714615 4743 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714619 4743 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714625 4743 flags.go:64] FLAG: --log-json-split-stream="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714629 4743 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714633 4743 flags.go:64] FLAG: --log-text-split-stream="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714637 4743 flags.go:64] FLAG: --logging-format="text" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714641 4743 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714646 4743 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714650 4743 flags.go:64] FLAG: --manifest-url="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714654 4743 flags.go:64] FLAG: --manifest-url-header="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714659 4743 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714664 4743 flags.go:64] FLAG: --max-open-files="1000000" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714669 4743 flags.go:64] FLAG: --max-pods="110" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714673 4743 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714677 4743 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 
10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714681 4743 flags.go:64] FLAG: --memory-manager-policy="None" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714686 4743 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714690 4743 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714694 4743 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714698 4743 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714707 4743 flags.go:64] FLAG: --node-status-max-images="50" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714711 4743 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714715 4743 flags.go:64] FLAG: --oom-score-adj="-999" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714719 4743 flags.go:64] FLAG: --pod-cidr="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714723 4743 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714729 4743 flags.go:64] FLAG: --pod-manifest-path="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714733 4743 flags.go:64] FLAG: --pod-max-pids="-1" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714737 4743 flags.go:64] FLAG: --pods-per-core="0" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714741 4743 flags.go:64] FLAG: --port="10250" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714745 4743 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714749 4743 flags.go:64] 
FLAG: --provider-id="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714753 4743 flags.go:64] FLAG: --qos-reserved="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714757 4743 flags.go:64] FLAG: --read-only-port="10255" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714761 4743 flags.go:64] FLAG: --register-node="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714766 4743 flags.go:64] FLAG: --register-schedulable="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714770 4743 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714776 4743 flags.go:64] FLAG: --registry-burst="10" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714780 4743 flags.go:64] FLAG: --registry-qps="5" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714784 4743 flags.go:64] FLAG: --reserved-cpus="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714788 4743 flags.go:64] FLAG: --reserved-memory="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714794 4743 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714798 4743 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714802 4743 flags.go:64] FLAG: --rotate-certificates="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714806 4743 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714825 4743 flags.go:64] FLAG: --runonce="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714829 4743 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714833 4743 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714837 4743 flags.go:64] FLAG: 
--seccomp-default="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714841 4743 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714845 4743 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714849 4743 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714853 4743 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714857 4743 flags.go:64] FLAG: --storage-driver-password="root" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714861 4743 flags.go:64] FLAG: --storage-driver-secure="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714865 4743 flags.go:64] FLAG: --storage-driver-table="stats" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714869 4743 flags.go:64] FLAG: --storage-driver-user="root" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714873 4743 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714877 4743 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714881 4743 flags.go:64] FLAG: --system-cgroups="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714885 4743 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714891 4743 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714895 4743 flags.go:64] FLAG: --tls-cert-file="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714899 4743 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714904 4743 flags.go:64] FLAG: --tls-min-version="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714907 4743 
flags.go:64] FLAG: --tls-private-key-file="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714911 4743 flags.go:64] FLAG: --topology-manager-policy="none" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714916 4743 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714919 4743 flags.go:64] FLAG: --topology-manager-scope="container" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714923 4743 flags.go:64] FLAG: --v="2" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714929 4743 flags.go:64] FLAG: --version="false" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714935 4743 flags.go:64] FLAG: --vmodule="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714940 4743 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.714945 4743 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717073 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717087 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717094 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717099 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717104 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717108 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717112 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717116 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717119 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717123 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717131 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717135 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717139 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717142 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717147 4743 feature_gate.go:330] unrecognized feature gate: Example Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717150 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717154 4743 
feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717157 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717161 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717165 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717168 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717172 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717176 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717181 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717185 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717189 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717193 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717198 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717202 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717205 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717209 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 
15:05:35.717218 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717222 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717227 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717231 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717235 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717241 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717245 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717248 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717252 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717256 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717260 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717263 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717267 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717270 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717274 4743 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717278 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717282 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717290 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717295 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717299 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717303 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717307 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717311 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717315 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717319 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717323 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717326 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717330 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:05:35 crc 
kubenswrapper[4743]: W0310 15:05:35.717334 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717340 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717344 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717347 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717353 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717356 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717380 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717389 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717637 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717648 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717654 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.717659 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.717669 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.727880 4743 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.727906 4743 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728007 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728016 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728024 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728033 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728041 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728048 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728054 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728061 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728068 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728075 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728082 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728089 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728096 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728103 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728109 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728117 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728123 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728130 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728136 4743 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728145 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728154 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728161 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728167 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728174 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728180 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728187 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728194 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728201 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728208 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728214 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728222 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728228 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 
15:05:35.728234 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728242 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728248 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728255 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728262 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728268 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728274 4743 feature_gate.go:330] unrecognized feature gate: Example Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728281 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728287 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728294 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728300 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728307 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728313 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728319 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728326 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 
15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728335 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728345 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728353 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728360 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728369 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728376 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728384 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728392 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728398 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728405 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728412 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728419 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728426 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728433 4743 feature_gate.go:330] unrecognized feature gate: 
MachineAPIMigration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728439 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728447 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728454 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728460 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728468 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728475 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728482 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728489 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728496 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728502 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.728514 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728701 4743 
feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728712 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728719 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728726 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728732 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728738 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728743 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728750 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728757 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728763 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728769 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728775 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728781 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728786 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728791 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728797 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728802 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728807 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728832 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728840 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728847 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728853 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728860 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728868 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728874 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728880 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728886 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728891 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728896 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728901 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728906 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728912 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728917 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728924 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728931 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728937 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728943 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728949 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728954 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728960 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728965 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728971 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728976 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728982 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728987 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728992 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.728997 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729002 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729007 4743 feature_gate.go:330] 
unrecognized feature gate: GCPLabelsTags Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729013 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729018 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729023 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729029 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729034 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729039 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729044 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729049 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729055 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729060 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729066 4743 feature_gate.go:330] unrecognized feature gate: Example Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729071 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729077 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729084 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 15:05:35 crc kubenswrapper[4743]: 
W0310 15:05:35.729091 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729097 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729104 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729111 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729117 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729123 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729129 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.729134 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.729142 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.730409 4743 server.go:940] "Client rotation is on, will bootstrap in background" Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.734496 4743 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 
2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.737517 4743 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.737603 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.739465 4743 server.go:997] "Starting client certificate rotation" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.739493 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.739637 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.763287 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.764971 4743 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.766667 4743 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.779257 4743 log.go:25] "Validated CRI v1 runtime API" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.811452 4743 log.go:25] "Validated CRI v1 image API" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.812874 4743 
server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.819634 4743 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-15-00-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.819668 4743 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.834503 4743 manager.go:217] Machine: {Timestamp:2026-03-10 15:05:35.83188928 +0000 UTC m=+0.538704038 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d399f706-59cf-40ea-a3ad-c58404098384 BootID:5e532390-ce89-4ad9-81e4-f384a9988976 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} 
{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:22:48:4d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:22:48:4d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b6:ee:b7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:46:e2:20 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b2:93:9a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:87:2a:5c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:22:c2:f0:b6:dc:81 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:1d:42:11:a2:1a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] 
CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.834709 4743 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.834838 4743 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.835677 4743 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.835836 4743 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.835864 4743 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.836032 4743 topology_manager.go:138] "Creating topology manager with none policy" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.836042 4743 container_manager_linux.go:303] "Creating device plugin manager" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.836593 4743 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.836617 4743 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.837281 4743 state_mem.go:36] "Initialized new in-memory state store" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.837360 4743 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.842947 4743 kubelet.go:418] "Attempting to sync node with API server" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.842964 4743 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.843003 4743 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.843015 4743 kubelet.go:324] "Adding apiserver pod source" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.843024 4743 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.847916 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.848012 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.847972 4743 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.848084 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.848285 4743 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.849213 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.852889 4743 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854289 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854318 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854328 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854338 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854353 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854363 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854372 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854386 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854398 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854407 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854420 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.854429 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.855185 4743 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.855633 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.855686 4743 server.go:1280] "Started kubelet" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.855750 4743 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.856578 4743 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.857088 4743 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 15:05:35 crc systemd[1]: Started Kubernetes Kubelet. Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.863194 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.863238 4743 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.864654 4743 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.864686 4743 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.864744 4743 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.864478 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.869514 4743 server.go:460] "Adding debug handlers to kubelet server" Mar 10 15:05:35 crc 
kubenswrapper[4743]: I0310 15:05:35.869876 4743 factory.go:55] Registering systemd factory Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.869908 4743 factory.go:221] Registration of the systemd container factory successfully Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.869833 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.869991 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.870165 4743 factory.go:153] Registering CRI-O factory Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.870192 4743 factory.go:221] Registration of the crio container factory successfully Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.870243 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="200ms" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.870280 4743 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.870308 4743 factory.go:103] Registering Raw factory Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 
15:05:35.870330 4743 manager.go:1196] Started watching for new ooms in manager Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.870992 4743 manager.go:319] Starting recovery of all containers Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.869869 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.115:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b83339455ff80 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.855656832 +0000 UTC m=+0.562471590,LastTimestamp:2026-03-10 15:05:35.855656832 +0000 UTC m=+0.562471590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875458 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875495 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875506 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875522 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875532 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875541 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875549 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875558 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875571 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" 
seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875579 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875589 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875616 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875625 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875635 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875643 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875656 
4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875669 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875682 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875694 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875706 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875718 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875730 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875742 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875755 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875769 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875782 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875798 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875830 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875846 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875859 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875870 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875883 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875894 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875908 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" 
seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875919 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875931 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875944 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875956 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875969 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.875981 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 15:05:35 crc 
kubenswrapper[4743]: I0310 15:05:35.875996 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876045 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876059 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876073 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876088 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876103 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876115 4743 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876129 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876142 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876154 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876171 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876184 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876201 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876214 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876227 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876240 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876254 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876267 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876280 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876293 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876305 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876317 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876329 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876343 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876356 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876366 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876378 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876388 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876400 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876424 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876435 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" 
seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876446 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876459 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876470 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876481 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876493 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876504 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 10 15:05:35 crc 
kubenswrapper[4743]: I0310 15:05:35.876534 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876546 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876557 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876569 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876580 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876591 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876602 4743 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876613 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876625 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876636 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876647 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876659 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876669 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876680 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876691 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876700 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876710 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876720 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876730 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876740 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876751 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876763 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876775 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876787 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876802 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876832 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876846 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876866 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876879 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876889 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876899 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876909 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876918 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876928 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876937 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876947 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876957 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.876992 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.877003 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.877015 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.877026 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878472 4743 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878499 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878514 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878528 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878540 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878552 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878563 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878575 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878585 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878598 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878609 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878620 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878632 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878643 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" 
seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878654 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878668 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878681 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878702 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878716 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878729 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: 
I0310 15:05:35.878740 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878751 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878763 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878774 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878785 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878798 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878824 4743 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878837 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878849 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878860 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878872 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878883 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878895 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878906 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878918 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878929 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878940 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878951 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878962 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878972 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.878983 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879006 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879017 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879027 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879039 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879050 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879061 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879072 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879083 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879100 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879114 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879125 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879136 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879149 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879161 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879173 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879184 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879195 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879205 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879215 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879226 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879236 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879261 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879271 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879281 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879291 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879300 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879338 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879349 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879360 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879370 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879380 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879392 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879403 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879414 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879425 
4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879436 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879447 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879457 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879468 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879481 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879495 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879508 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879520 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879532 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879543 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879554 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879564 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879577 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879588 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879599 4743 reconstruct.go:97] "Volume reconstruction finished" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.879608 4743 reconciler.go:26] "Reconciler: start to sync state" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.892865 4743 manager.go:324] Recovery completed Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.905138 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.906968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.907001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.907011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.908151 4743 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 
15:05:35.908171 4743 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.908190 4743 state_mem.go:36] "Initialized new in-memory state store" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.911586 4743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.914050 4743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.914087 4743 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.914107 4743 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.914148 4743 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 15:05:35 crc kubenswrapper[4743]: W0310 15:05:35.914840 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.914888 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.933325 4743 policy_none.go:49] "None policy: Start" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.934411 4743 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 
15:05:35.934468 4743 state_mem.go:35] "Initializing new in-memory state store" Mar 10 15:05:35 crc kubenswrapper[4743]: E0310 15:05:35.965166 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.993605 4743 manager.go:334] "Starting Device Plugin manager" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.993655 4743 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.993672 4743 server.go:79] "Starting device plugin registration server" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.994157 4743 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.994174 4743 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.994349 4743 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.994441 4743 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 15:05:35 crc kubenswrapper[4743]: I0310 15:05:35.994460 4743 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 15:05:36 crc kubenswrapper[4743]: E0310 15:05:36.004772 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.014870 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 
10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.014948 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.015975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.016013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.016023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.016166 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.016356 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.016388 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017069 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017224 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017449 4743 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017521 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017831 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017854 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.017863 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018092 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018231 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018386 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018417 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.018989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.019164 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.019229 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.019255 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.019878 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.019915 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.019924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020078 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020453 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.020983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.021050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.021070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: E0310 15:05:36.070779 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="400ms" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.080882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.080944 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.080975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081018 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081136 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081174 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081213 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081282 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081487 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.081517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.095300 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.098304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.098345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.098367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.098453 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:05:36 crc kubenswrapper[4743]: E0310 15:05:36.099608 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 
15:05:36.182435 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182541 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182570 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182600 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182635 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182680 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182668 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182694 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182706 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182782 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182754 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182793 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182859 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182893 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182891 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182920 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182886 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182857 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182951 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.182942 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.300503 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.301629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.301655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.301664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.301695 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:05:36 crc kubenswrapper[4743]: E0310 15:05:36.302190 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Mar 
10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.344030 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.357109 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.364344 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.378956 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.384987 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:36 crc kubenswrapper[4743]: W0310 15:05:36.386542 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b7e4c4501c6d542a9575bdb11a8e2b6e058358aee348c6386c1490e6a635483f WatchSource:0}: Error finding container b7e4c4501c6d542a9575bdb11a8e2b6e058358aee348c6386c1490e6a635483f: Status 404 returned error can't find the container with id b7e4c4501c6d542a9575bdb11a8e2b6e058358aee348c6386c1490e6a635483f Mar 10 15:05:36 crc kubenswrapper[4743]: W0310 15:05:36.386942 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d41a1eaad34bf7f910c38b9ae30aa265609ed2c1be92686b5c1eeeddaa018efc WatchSource:0}: Error finding container d41a1eaad34bf7f910c38b9ae30aa265609ed2c1be92686b5c1eeeddaa018efc: Status 404 returned error can't find the container with id 
d41a1eaad34bf7f910c38b9ae30aa265609ed2c1be92686b5c1eeeddaa018efc Mar 10 15:05:36 crc kubenswrapper[4743]: W0310 15:05:36.391710 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-19c2f725682cd4a7fec1cee53d98e47ab576945da5a27c668fa4d4bb64d30260 WatchSource:0}: Error finding container 19c2f725682cd4a7fec1cee53d98e47ab576945da5a27c668fa4d4bb64d30260: Status 404 returned error can't find the container with id 19c2f725682cd4a7fec1cee53d98e47ab576945da5a27c668fa4d4bb64d30260 Mar 10 15:05:36 crc kubenswrapper[4743]: W0310 15:05:36.398903 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-688104e92da8291cfe782c9878972ed2e183fc929180b8377fae5b232497158a WatchSource:0}: Error finding container 688104e92da8291cfe782c9878972ed2e183fc929180b8377fae5b232497158a: Status 404 returned error can't find the container with id 688104e92da8291cfe782c9878972ed2e183fc929180b8377fae5b232497158a Mar 10 15:05:36 crc kubenswrapper[4743]: W0310 15:05:36.401355 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b7d6747e660efaafb5e1a162996ddb038a5f54d4500143026349fc87d48d4feb WatchSource:0}: Error finding container b7d6747e660efaafb5e1a162996ddb038a5f54d4500143026349fc87d48d4feb: Status 404 returned error can't find the container with id b7d6747e660efaafb5e1a162996ddb038a5f54d4500143026349fc87d48d4feb Mar 10 15:05:36 crc kubenswrapper[4743]: E0310 15:05:36.471614 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" 
interval="800ms" Mar 10 15:05:36 crc kubenswrapper[4743]: W0310 15:05:36.670611 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:36 crc kubenswrapper[4743]: E0310 15:05:36.670694 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.703043 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.704328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.704361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.704369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.704388 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:05:36 crc kubenswrapper[4743]: E0310 15:05:36.704972 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Mar 10 15:05:36 crc kubenswrapper[4743]: W0310 15:05:36.821063 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:36 crc kubenswrapper[4743]: E0310 15:05:36.821146 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.856352 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.918571 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"688104e92da8291cfe782c9878972ed2e183fc929180b8377fae5b232497158a"} Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.921001 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"19c2f725682cd4a7fec1cee53d98e47ab576945da5a27c668fa4d4bb64d30260"} Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.922201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b7e4c4501c6d542a9575bdb11a8e2b6e058358aee348c6386c1490e6a635483f"} Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.923436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d41a1eaad34bf7f910c38b9ae30aa265609ed2c1be92686b5c1eeeddaa018efc"} Mar 10 15:05:36 crc kubenswrapper[4743]: I0310 15:05:36.924426 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b7d6747e660efaafb5e1a162996ddb038a5f54d4500143026349fc87d48d4feb"} Mar 10 15:05:37 crc kubenswrapper[4743]: W0310 15:05:37.197339 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:37 crc kubenswrapper[4743]: E0310 15:05:37.197428 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:37 crc kubenswrapper[4743]: E0310 15:05:37.272994 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="1.6s" Mar 10 15:05:37 crc kubenswrapper[4743]: W0310 15:05:37.395674 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:37 crc kubenswrapper[4743]: E0310 15:05:37.395785 4743 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.505481 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.507560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.507608 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.507620 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.507653 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:05:37 crc kubenswrapper[4743]: E0310 15:05:37.508323 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.857691 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.882234 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:05:37 crc kubenswrapper[4743]: E0310 15:05:37.883354 4743 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.930102 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f" exitCode=0 Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.930238 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.930239 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f"} Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.932381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.932430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.932444 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.932683 4743 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e" exitCode=0 Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.932757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e"} Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.932809 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.933728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.933753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.933764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.934383 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.935189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.935231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.935245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.936219 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64"} Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.936284 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33"} Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.936306 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3"} Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.938113 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d" exitCode=0 Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.938207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d"} Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.938388 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.939577 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.939626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.939636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.939878 4743 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898" exitCode=0 Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.939915 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898"} Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.939963 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.940662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.940689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:37 crc kubenswrapper[4743]: I0310 15:05:37.940700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.857362 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:38 crc kubenswrapper[4743]: E0310 15:05:38.874618 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="3.2s" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.945668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.945733 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.945748 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.945764 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.947776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.947832 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.947862 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.950185 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.950229 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:38 crc 
kubenswrapper[4743]: I0310 15:05:38.951365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.951413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.951428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.951729 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1" exitCode=0 Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.951799 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.951846 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.952645 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.952671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.952683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.954804 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.954886 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.955671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.955690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.955699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.961043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.961067 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.961077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636"} Mar 10 15:05:38 crc kubenswrapper[4743]: I0310 15:05:38.961086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed"} Mar 10 15:05:38 crc kubenswrapper[4743]: W0310 15:05:38.990885 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:38 crc kubenswrapper[4743]: E0310 15:05:38.990966 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.109090 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.110905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.110952 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.110963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.110991 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:05:39 crc kubenswrapper[4743]: E0310 15:05:39.111546 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.115:6443: connect: connection refused" node="crc" Mar 10 
15:05:39 crc kubenswrapper[4743]: W0310 15:05:39.337197 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:05:39 crc kubenswrapper[4743]: E0310 15:05:39.337305 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.747032 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.966414 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931" exitCode=0 Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.966473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931"} Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.966543 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.967637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.967660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:39 crc 
kubenswrapper[4743]: I0310 15:05:39.967671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.968793 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.971561 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0559ac1b0dab295bd8a8cd05a2e5b375fa1dd8e9fc19409c020b489e37723af8" exitCode=255 Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.971666 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.971720 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.971740 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.971766 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.971767 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.971808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0559ac1b0dab295bd8a8cd05a2e5b375fa1dd8e9fc19409c020b489e37723af8"} Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.973284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.973321 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.973338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.974072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.974112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.974149 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.974342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.974395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.974411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.976019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.976103 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.976124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:39 crc kubenswrapper[4743]: I0310 15:05:39.978058 4743 scope.go:117] "RemoveContainer" 
containerID="0559ac1b0dab295bd8a8cd05a2e5b375fa1dd8e9fc19409c020b489e37723af8" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.976386 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.978313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3"} Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.978461 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.978475 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.979368 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.979399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.979410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.982724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412"} Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.982752 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529"} Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.982766 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef"} Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.982778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294"} Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.982791 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.982794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db"} Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.982780 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.983480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.983502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.983510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.983606 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.983625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:40 crc kubenswrapper[4743]: I0310 15:05:40.983635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.950574 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.985481 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.985505 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.985481 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.986493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.986530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.986493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.986542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:41 crc kubenswrapper[4743]: I0310 15:05:41.986625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:41 crc kubenswrapper[4743]: 
I0310 15:05:41.986647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.050603 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.312188 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.313779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.313810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.313831 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.313850 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.533924 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.728028 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.728251 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.729278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.729312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 
15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.729323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.987705 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.987705 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.988778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.988852 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.988870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.989098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.989125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:42 crc kubenswrapper[4743]: I0310 15:05:42.989136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.146420 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.146669 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.148147 4743 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.148198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.148208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.153578 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.989319 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.990343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.990382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:43 crc kubenswrapper[4743]: I0310 15:05:43.990391 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.493120 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.493390 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.495173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.495244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.495266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.756026 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.756213 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.757237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.757270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:44 crc kubenswrapper[4743]: I0310 15:05:44.757278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:46 crc kubenswrapper[4743]: E0310 15:05:46.005616 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:05:46 crc kubenswrapper[4743]: I0310 15:05:46.298924 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:05:46 crc kubenswrapper[4743]: I0310 15:05:46.299067 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:46 crc kubenswrapper[4743]: I0310 15:05:46.300141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:46 crc kubenswrapper[4743]: I0310 15:05:46.300175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:46 crc kubenswrapper[4743]: I0310 
15:05:46.300187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:48 crc kubenswrapper[4743]: I0310 15:05:48.588719 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:48 crc kubenswrapper[4743]: I0310 15:05:48.589092 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:48 crc kubenswrapper[4743]: I0310 15:05:48.591279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:48 crc kubenswrapper[4743]: I0310 15:05:48.591341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:48 crc kubenswrapper[4743]: I0310 15:05:48.591364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:48 crc kubenswrapper[4743]: I0310 15:05:48.598280 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:05:49 crc kubenswrapper[4743]: I0310 15:05:49.011215 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:49 crc kubenswrapper[4743]: I0310 15:05:49.012614 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:49 crc kubenswrapper[4743]: I0310 15:05:49.012678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:49 crc kubenswrapper[4743]: I0310 15:05:49.012705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:49 crc kubenswrapper[4743]: W0310 15:05:49.705277 4743 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 15:05:49 crc kubenswrapper[4743]: I0310 15:05:49.705398 4743 trace.go:236] Trace[1440102936]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 15:05:39.703) (total time: 10001ms): Mar 10 15:05:49 crc kubenswrapper[4743]: Trace[1440102936]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:05:49.705) Mar 10 15:05:49 crc kubenswrapper[4743]: Trace[1440102936]: [10.0015374s] [10.0015374s] END Mar 10 15:05:49 crc kubenswrapper[4743]: E0310 15:05:49.705432 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 15:05:49 crc kubenswrapper[4743]: I0310 15:05:49.857876 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 15:05:50 crc kubenswrapper[4743]: W0310 15:05:50.312111 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 15:05:50 crc kubenswrapper[4743]: I0310 15:05:50.312231 4743 trace.go:236] Trace[1248632735]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 15:05:40.310) (total time: 10001ms): Mar 10 15:05:50 crc kubenswrapper[4743]: 
Trace[1248632735]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:05:50.312) Mar 10 15:05:50 crc kubenswrapper[4743]: Trace[1248632735]: [10.001470079s] [10.001470079s] END Mar 10 15:05:50 crc kubenswrapper[4743]: E0310 15:05:50.312263 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 15:05:50 crc kubenswrapper[4743]: E0310 15:05:50.349670 4743 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:05:50 crc kubenswrapper[4743]: E0310 15:05:50.351532 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 15:05:50 crc kubenswrapper[4743]: E0310 15:05:50.353310 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z" 
node="crc" Mar 10 15:05:50 crc kubenswrapper[4743]: W0310 15:05:50.356436 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z Mar 10 15:05:50 crc kubenswrapper[4743]: E0310 15:05:50.356536 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:05:50 crc kubenswrapper[4743]: E0310 15:05:50.357342 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b83339455ff80 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.855656832 +0000 UTC m=+0.562471590,LastTimestamp:2026-03-10 15:05:35.855656832 +0000 UTC m=+0.562471590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:05:50 crc kubenswrapper[4743]: W0310 15:05:50.358491 4743 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z Mar 10 15:05:50 crc kubenswrapper[4743]: E0310 15:05:50.358595 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:05:50 crc kubenswrapper[4743]: I0310 15:05:50.361831 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 15:05:50 crc kubenswrapper[4743]: I0310 15:05:50.361925 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 15:05:50 crc kubenswrapper[4743]: I0310 15:05:50.365851 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" 
cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 10 15:05:50 crc kubenswrapper[4743]: I0310 15:05:50.365953 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 15:05:50 crc kubenswrapper[4743]: I0310 15:05:50.860154 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:50Z is after 2026-02-23T05:33:13Z Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.017715 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.018101 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.019396 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3" exitCode=255 Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.019439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3"} Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.019479 4743 scope.go:117] "RemoveContainer" containerID="0559ac1b0dab295bd8a8cd05a2e5b375fa1dd8e9fc19409c020b489e37723af8" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.019620 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.020766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.020800 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.020829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.021444 4743 scope.go:117] "RemoveContainer" containerID="018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3" Mar 10 15:05:51 crc kubenswrapper[4743]: E0310 15:05:51.021638 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.589019 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.589128 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:05:51 crc kubenswrapper[4743]: I0310 15:05:51.859735 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:51Z is after 2026-02-23T05:33:13Z Mar 10 15:05:52 crc kubenswrapper[4743]: I0310 15:05:52.023105 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:05:52 crc kubenswrapper[4743]: I0310 15:05:52.573684 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 10 15:05:52 crc kubenswrapper[4743]: I0310 15:05:52.574024 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:52 crc kubenswrapper[4743]: I0310 15:05:52.576044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:52 crc kubenswrapper[4743]: I0310 15:05:52.576091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:52 crc kubenswrapper[4743]: I0310 15:05:52.576109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:05:52 crc kubenswrapper[4743]: I0310 15:05:52.590227 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 15:05:52 crc kubenswrapper[4743]: I0310 15:05:52.861816 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:52Z is after 2026-02-23T05:33:13Z Mar 10 15:05:53 crc kubenswrapper[4743]: I0310 15:05:53.028068 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:53 crc kubenswrapper[4743]: I0310 15:05:53.029568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:53 crc kubenswrapper[4743]: I0310 15:05:53.029638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:53 crc kubenswrapper[4743]: I0310 15:05:53.029661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:53 crc kubenswrapper[4743]: I0310 15:05:53.860712 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:53Z is after 2026-02-23T05:33:13Z Mar 10 15:05:54 crc kubenswrapper[4743]: W0310 15:05:54.467504 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:05:54Z is after 2026-02-23T05:33:13Z Mar 10 15:05:54 crc kubenswrapper[4743]: E0310 15:05:54.467598 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:05:54 crc kubenswrapper[4743]: I0310 15:05:54.498844 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:54 crc kubenswrapper[4743]: I0310 15:05:54.499017 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:54 crc kubenswrapper[4743]: I0310 15:05:54.500297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:54 crc kubenswrapper[4743]: I0310 15:05:54.500342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:54 crc kubenswrapper[4743]: I0310 15:05:54.500354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:54 crc kubenswrapper[4743]: I0310 15:05:54.500904 4743 scope.go:117] "RemoveContainer" containerID="018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3" Mar 10 15:05:54 crc kubenswrapper[4743]: E0310 15:05:54.501112 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:05:54 crc kubenswrapper[4743]: I0310 15:05:54.502501 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:54 crc kubenswrapper[4743]: W0310 15:05:54.841983 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:54Z is after 2026-02-23T05:33:13Z Mar 10 15:05:54 crc kubenswrapper[4743]: E0310 15:05:54.842804 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:05:54 crc kubenswrapper[4743]: I0310 15:05:54.860049 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:54Z is after 2026-02-23T05:33:13Z Mar 10 15:05:55 crc kubenswrapper[4743]: I0310 15:05:55.036232 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:55 crc kubenswrapper[4743]: I0310 15:05:55.038464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 15:05:55 crc kubenswrapper[4743]: I0310 15:05:55.038498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:55 crc kubenswrapper[4743]: I0310 15:05:55.038509 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:55 crc kubenswrapper[4743]: I0310 15:05:55.040479 4743 scope.go:117] "RemoveContainer" containerID="018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3" Mar 10 15:05:55 crc kubenswrapper[4743]: E0310 15:05:55.040894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:05:55 crc kubenswrapper[4743]: I0310 15:05:55.860289 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:55Z is after 2026-02-23T05:33:13Z Mar 10 15:05:56 crc kubenswrapper[4743]: E0310 15:05:56.010218 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:05:56 crc kubenswrapper[4743]: I0310 15:05:56.753900 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:56 crc kubenswrapper[4743]: I0310 15:05:56.755365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:56 crc kubenswrapper[4743]: I0310 
15:05:56.755510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:56 crc kubenswrapper[4743]: I0310 15:05:56.755539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:56 crc kubenswrapper[4743]: I0310 15:05:56.755593 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:05:56 crc kubenswrapper[4743]: E0310 15:05:56.757320 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:56Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 15:05:56 crc kubenswrapper[4743]: E0310 15:05:56.767440 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:56Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 15:05:56 crc kubenswrapper[4743]: I0310 15:05:56.862063 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:56Z is after 2026-02-23T05:33:13Z Mar 10 15:05:57 crc kubenswrapper[4743]: I0310 15:05:57.860472 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:57Z is after 
2026-02-23T05:33:13Z Mar 10 15:05:57 crc kubenswrapper[4743]: I0310 15:05:57.966113 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:05:57 crc kubenswrapper[4743]: I0310 15:05:57.966347 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:05:57 crc kubenswrapper[4743]: I0310 15:05:57.967659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:05:57 crc kubenswrapper[4743]: I0310 15:05:57.967705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:05:57 crc kubenswrapper[4743]: I0310 15:05:57.967740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:05:57 crc kubenswrapper[4743]: I0310 15:05:57.968462 4743 scope.go:117] "RemoveContainer" containerID="018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3" Mar 10 15:05:57 crc kubenswrapper[4743]: E0310 15:05:57.968649 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:05:58 crc kubenswrapper[4743]: I0310 15:05:58.576791 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:05:58 crc kubenswrapper[4743]: E0310 15:05:58.582269 4743 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:05:58 crc kubenswrapper[4743]: I0310 15:05:58.859271 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:58Z is after 2026-02-23T05:33:13Z Mar 10 15:05:59 crc kubenswrapper[4743]: W0310 15:05:59.337475 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:59Z is after 2026-02-23T05:33:13Z Mar 10 15:05:59 crc kubenswrapper[4743]: E0310 15:05:59.337572 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:05:59 crc kubenswrapper[4743]: I0310 15:05:59.860440 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:59Z is after 
2026-02-23T05:33:13Z Mar 10 15:05:59 crc kubenswrapper[4743]: W0310 15:05:59.997440 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:59Z is after 2026-02-23T05:33:13Z Mar 10 15:05:59 crc kubenswrapper[4743]: E0310 15:05:59.997546 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:05:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:00 crc kubenswrapper[4743]: E0310 15:06:00.366362 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b83339455ff80 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.855656832 +0000 UTC m=+0.562471590,LastTimestamp:2026-03-10 15:05:35.855656832 +0000 UTC m=+0.562471590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:00 crc kubenswrapper[4743]: I0310 15:06:00.861404 4743 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:00Z is after 2026-02-23T05:33:13Z Mar 10 15:06:01 crc kubenswrapper[4743]: I0310 15:06:01.589136 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:06:01 crc kubenswrapper[4743]: I0310 15:06:01.589265 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:06:01 crc kubenswrapper[4743]: I0310 15:06:01.861741 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:01Z is after 2026-02-23T05:33:13Z Mar 10 15:06:02 crc kubenswrapper[4743]: I0310 15:06:02.861034 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:02Z is after 2026-02-23T05:33:13Z Mar 10 15:06:03 crc kubenswrapper[4743]: 
E0310 15:06:03.760692 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:03Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 15:06:03 crc kubenswrapper[4743]: I0310 15:06:03.768125 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:03 crc kubenswrapper[4743]: I0310 15:06:03.770152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:03 crc kubenswrapper[4743]: I0310 15:06:03.770378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:03 crc kubenswrapper[4743]: I0310 15:06:03.770459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:03 crc kubenswrapper[4743]: I0310 15:06:03.770548 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:03 crc kubenswrapper[4743]: E0310 15:06:03.774020 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:03Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 15:06:03 crc kubenswrapper[4743]: I0310 15:06:03.859921 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:03Z is after 2026-02-23T05:33:13Z Mar 10 15:06:04 crc kubenswrapper[4743]: 
I0310 15:06:04.861747 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:04Z is after 2026-02-23T05:33:13Z Mar 10 15:06:05 crc kubenswrapper[4743]: W0310 15:06:05.492333 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:05Z is after 2026-02-23T05:33:13Z Mar 10 15:06:05 crc kubenswrapper[4743]: E0310 15:06:05.492429 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:05 crc kubenswrapper[4743]: I0310 15:06:05.858566 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:05Z is after 2026-02-23T05:33:13Z Mar 10 15:06:06 crc kubenswrapper[4743]: E0310 15:06:06.010506 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:06:06 crc kubenswrapper[4743]: W0310 15:06:06.488622 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:06Z is after 2026-02-23T05:33:13Z Mar 10 15:06:06 crc kubenswrapper[4743]: E0310 15:06:06.489073 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:06 crc kubenswrapper[4743]: I0310 15:06:06.859639 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:06Z is after 2026-02-23T05:33:13Z Mar 10 15:06:07 crc kubenswrapper[4743]: I0310 15:06:07.864032 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.588777 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused" start-of-body= Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.588915 4743 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused" Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.588998 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.589208 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.591013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.591084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.591107 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.591964 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.592319 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33" 
gracePeriod=30 Mar 10 15:06:08 crc kubenswrapper[4743]: I0310 15:06:08.861656 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.080211 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.080911 4743 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33" exitCode=255 Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.080965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33"} Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.081002 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c"} Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.081111 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.081946 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.081990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.082004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:09 crc kubenswrapper[4743]: I0310 15:06:09.860727 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.372228 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339455ff80 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.855656832 +0000 UTC m=+0.562471590,LastTimestamp:2026-03-10 15:05:35.855656832 +0000 UTC m=+0.562471590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.384237 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397654221 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 
15:05:35.906988577 +0000 UTC m=+0.613803315,LastTimestamp:2026-03-10 15:05:35.906988577 +0000 UTC m=+0.613803315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.389142 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397658ebc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,LastTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.392645 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339765ab23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907015459 +0000 UTC m=+0.613830207,LastTimestamp:2026-03-10 15:05:35.907015459 +0000 UTC m=+0.613830207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.397123 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397654221\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397654221 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.906988577 +0000 UTC m=+0.613803315,LastTimestamp:2026-03-10 15:05:36.016000047 +0000 UTC m=+0.722814785,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.401151 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397658ebc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397658ebc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,LastTimestamp:2026-03-10 15:05:36.016019607 +0000 UTC m=+0.722834355,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.405045 4743 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b83339765ab23\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339765ab23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907015459 +0000 UTC m=+0.613830207,LastTimestamp:2026-03-10 15:05:36.016028748 +0000 UTC m=+0.722843496,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.408484 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339df16468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:36.016835688 +0000 UTC m=+0.723650436,LastTimestamp:2026-03-10 15:05:36.016835688 +0000 UTC m=+0.723650436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.411989 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397654221\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397654221 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.906988577 +0000 UTC m=+0.613803315,LastTimestamp:2026-03-10 15:05:36.01705932 +0000 UTC m=+0.723874068,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.415530 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397658ebc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397658ebc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,LastTimestamp:2026-03-10 15:05:36.017078061 +0000 UTC m=+0.723892809,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.419291 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b83339765ab23\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339765ab23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907015459 +0000 UTC m=+0.613830207,LastTimestamp:2026-03-10 15:05:36.017088731 +0000 UTC m=+0.723903469,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.423036 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397654221\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397654221 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.906988577 +0000 UTC m=+0.613803315,LastTimestamp:2026-03-10 15:05:36.017847989 +0000 UTC m=+0.724662737,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.427014 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397658ebc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397658ebc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,LastTimestamp:2026-03-10 15:05:36.0178612 +0000 UTC m=+0.724675948,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.430871 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b83339765ab23\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339765ab23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907015459 +0000 UTC m=+0.613830207,LastTimestamp:2026-03-10 15:05:36.01786809 +0000 UTC m=+0.724682838,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.434411 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397654221\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397654221 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.906988577 +0000 UTC 
m=+0.613803315,LastTimestamp:2026-03-10 15:05:36.018108852 +0000 UTC m=+0.724923600,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.438157 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397658ebc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397658ebc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,LastTimestamp:2026-03-10 15:05:36.018122263 +0000 UTC m=+0.724937011,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.441743 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b83339765ab23\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339765ab23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907015459 +0000 UTC m=+0.613830207,LastTimestamp:2026-03-10 15:05:36.018132904 +0000 UTC m=+0.724947652,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.445326 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397654221\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397654221 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.906988577 +0000 UTC m=+0.613803315,LastTimestamp:2026-03-10 15:05:36.01866058 +0000 UTC m=+0.725475328,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.448598 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397658ebc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397658ebc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,LastTimestamp:2026-03-10 15:05:36.018676361 +0000 UTC m=+0.725491109,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.452026 4743 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b83339765ab23\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339765ab23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907015459 +0000 UTC m=+0.613830207,LastTimestamp:2026-03-10 15:05:36.018684801 +0000 UTC m=+0.725499549,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.455084 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397654221\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397654221 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.906988577 +0000 UTC m=+0.613803315,LastTimestamp:2026-03-10 15:05:36.018940574 +0000 UTC m=+0.725755312,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.458018 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397658ebc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397658ebc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,LastTimestamp:2026-03-10 15:05:36.018979036 +0000 UTC m=+0.725793784,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.461114 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b83339765ab23\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b83339765ab23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907015459 +0000 UTC m=+0.613830207,LastTimestamp:2026-03-10 15:05:36.018996037 +0000 UTC m=+0.725810775,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.465415 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397654221\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397654221 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.906988577 +0000 UTC m=+0.613803315,LastTimestamp:2026-03-10 15:05:36.019892222 +0000 UTC m=+0.726706970,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.469269 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833397658ebc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833397658ebc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:35.907008188 +0000 UTC m=+0.613822936,LastTimestamp:2026-03-10 15:05:36.019921663 +0000 UTC m=+0.726736411,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.473824 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b8333b44934b2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:36.391689394 +0000 UTC m=+1.098504152,LastTimestamp:2026-03-10 15:05:36.391689394 +0000 UTC m=+1.098504152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.476923 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8333b463904c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:36.39341678 +0000 UTC m=+1.100231528,LastTimestamp:2026-03-10 15:05:36.39341678 +0000 UTC m=+1.100231528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.480708 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333b4851da1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:36.395615649 +0000 UTC m=+1.102430397,LastTimestamp:2026-03-10 15:05:36.395615649 +0000 UTC m=+1.102430397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.484106 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8333b4cdbf2b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:36.400375595 +0000 UTC 
m=+1.107190343,LastTimestamp:2026-03-10 15:05:36.400375595 +0000 UTC m=+1.107190343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.488614 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8333b4f865b3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:36.403170739 +0000 UTC m=+1.109985487,LastTimestamp:2026-03-10 15:05:36.403170739 +0000 UTC m=+1.109985487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.492384 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333e4cdb6ef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.205679855 +0000 UTC m=+1.912494603,LastTimestamp:2026-03-10 15:05:37.205679855 +0000 UTC m=+1.912494603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.495763 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b8333e4cfc5dc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.205814748 +0000 UTC m=+1.912629496,LastTimestamp:2026-03-10 15:05:37.205814748 +0000 UTC m=+1.912629496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.499480 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b8333e4cfe6df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.205823199 +0000 UTC m=+1.912637947,LastTimestamp:2026-03-10 15:05:37.205823199 +0000 UTC m=+1.912637947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.502777 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8333e4d12353 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.205904211 +0000 UTC m=+1.912719009,LastTimestamp:2026-03-10 15:05:37.205904211 +0000 UTC m=+1.912719009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.506201 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8333e4d1987a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.205934202 +0000 UTC m=+1.912748950,LastTimestamp:2026-03-10 15:05:37.205934202 +0000 UTC m=+1.912748950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.510924 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8333e5ba382c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.221179436 +0000 UTC m=+1.927994184,LastTimestamp:2026-03-10 15:05:37.221179436 +0000 UTC m=+1.927994184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.514176 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333e5f53a29 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.225046569 +0000 UTC m=+1.931861317,LastTimestamp:2026-03-10 15:05:37.225046569 +0000 UTC m=+1.931861317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.517103 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8333e5f6bd20 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.225145632 +0000 UTC m=+1.931960420,LastTimestamp:2026-03-10 15:05:37.225145632 +0000 UTC m=+1.931960420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.520367 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8333e5f9b949 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.225341257 +0000 UTC m=+1.932156005,LastTimestamp:2026-03-10 15:05:37.225341257 +0000 UTC m=+1.932156005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.523485 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333e602a202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.225925122 +0000 UTC m=+1.932739870,LastTimestamp:2026-03-10 15:05:37.225925122 +0000 UTC m=+1.932739870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.527086 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b8333e6036d56 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.225977174 +0000 UTC m=+1.932791922,LastTimestamp:2026-03-10 15:05:37.225977174 +0000 UTC m=+1.932791922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.531145 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333f9d28941 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.558317377 +0000 UTC 
m=+2.265132125,LastTimestamp:2026-03-10 15:05:37.558317377 +0000 UTC m=+2.265132125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.535008 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333fac68b18 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.574308632 +0000 UTC m=+2.281123400,LastTimestamp:2026-03-10 15:05:37.574308632 +0000 UTC m=+2.281123400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.538656 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333fadd75ed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.575810541 +0000 UTC m=+2.282625299,LastTimestamp:2026-03-10 15:05:37.575810541 +0000 UTC m=+2.282625299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.542871 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833407b17218 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.791029784 +0000 UTC m=+2.497844542,LastTimestamp:2026-03-10 15:05:37.791029784 +0000 UTC m=+2.497844542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.546516 4743 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b83340886475f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.804978015 +0000 UTC m=+2.511792773,LastTimestamp:2026-03-10 15:05:37.804978015 +0000 UTC m=+2.511792773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.550987 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8334089a48a3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 
15:05:37.806289059 +0000 UTC m=+2.513103817,LastTimestamp:2026-03-10 15:05:37.806289059 +0000 UTC m=+2.513103817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.556605 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83341037d523 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.934054691 +0000 UTC m=+2.640869439,LastTimestamp:2026-03-10 15:05:37.934054691 +0000 UTC m=+2.640869439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.560713 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833410431dd9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.934794201 +0000 UTC m=+2.641608949,LastTimestamp:2026-03-10 15:05:37.934794201 +0000 UTC m=+2.641608949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.565197 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833410be8a50 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.942882896 +0000 UTC m=+2.649697644,LastTimestamp:2026-03-10 15:05:37.942882896 +0000 UTC m=+2.649697644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.569185 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b833410c82e4f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.943514703 +0000 UTC m=+2.650329451,LastTimestamp:2026-03-10 15:05:37.943514703 +0000 UTC m=+2.650329451,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.572866 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833414d13416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.01121487 +0000 UTC m=+2.718029618,LastTimestamp:2026-03-10 15:05:38.01121487 +0000 UTC 
m=+2.718029618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.577611 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833415eed67f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.029934207 +0000 UTC m=+2.736748955,LastTimestamp:2026-03-10 15:05:38.029934207 +0000 UTC m=+2.736748955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.581434 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b83341ca51c93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.142542995 +0000 UTC m=+2.849357743,LastTimestamp:2026-03-10 15:05:38.142542995 +0000 UTC m=+2.849357743,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.585453 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83341caca2ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.14303611 +0000 UTC m=+2.849850858,LastTimestamp:2026-03-10 15:05:38.14303611 +0000 UTC m=+2.849850858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.588924 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b83341cba6586 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.143937926 +0000 UTC m=+2.850752674,LastTimestamp:2026-03-10 15:05:38.143937926 +0000 UTC m=+2.850752674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.593400 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b83341ddd25e4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.162992612 +0000 UTC m=+2.869807360,LastTimestamp:2026-03-10 15:05:38.162992612 +0000 UTC m=+2.869807360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.596541 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b83341e328800 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.168588288 +0000 UTC m=+2.875403036,LastTimestamp:2026-03-10 15:05:38.168588288 +0000 UTC m=+2.875403036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.599502 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b83341e352ec5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.168762053 +0000 UTC m=+2.875576801,LastTimestamp:2026-03-10 15:05:38.168762053 +0000 UTC m=+2.875576801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.602614 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83341e49a157 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.170102103 +0000 UTC m=+2.876916851,LastTimestamp:2026-03-10 15:05:38.170102103 +0000 UTC m=+2.876916851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.605905 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b83341e4ddf31 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.170380081 +0000 UTC m=+2.877194829,LastTimestamp:2026-03-10 15:05:38.170380081 +0000 UTC 
m=+2.877194829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.610413 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b83341e76a79e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.17305283 +0000 UTC m=+2.879867578,LastTimestamp:2026-03-10 15:05:38.17305283 +0000 UTC m=+2.879867578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.614575 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b83341fbc3e2d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.194390573 +0000 UTC m=+2.901205321,LastTimestamp:2026-03-10 15:05:38.194390573 +0000 UTC 
m=+2.901205321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.618737 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b83342abf5978 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.379143544 +0000 UTC m=+3.085958292,LastTimestamp:2026-03-10 15:05:38.379143544 +0000 UTC m=+3.085958292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.622552 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83342ae64baf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 
15:05:38.381695919 +0000 UTC m=+3.088510667,LastTimestamp:2026-03-10 15:05:38.381695919 +0000 UTC m=+3.088510667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.626000 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b83342be4a9d4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.398366164 +0000 UTC m=+3.105180922,LastTimestamp:2026-03-10 15:05:38.398366164 +0000 UTC m=+3.105180922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.629796 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b83342bf49f43 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.399412035 +0000 UTC m=+3.106226783,LastTimestamp:2026-03-10 15:05:38.399412035 +0000 UTC m=+3.106226783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.633088 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83342c7a21c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.408161734 +0000 UTC m=+3.114976482,LastTimestamp:2026-03-10 15:05:38.408161734 +0000 UTC m=+3.114976482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.636420 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83342c8fc15f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.409578847 +0000 UTC m=+3.116393585,LastTimestamp:2026-03-10 15:05:38.409578847 +0000 UTC m=+3.116393585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.639670 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833436a3bbd8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.578660312 +0000 UTC m=+3.285475070,LastTimestamp:2026-03-10 15:05:38.578660312 +0000 UTC m=+3.285475070,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.644356 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833436a9e94c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.579065164 +0000 UTC m=+3.285879912,LastTimestamp:2026-03-10 15:05:38.579065164 +0000 UTC m=+3.285879912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.648206 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8334372bf5e3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.587588067 
+0000 UTC m=+3.294402805,LastTimestamp:2026-03-10 15:05:38.587588067 +0000 UTC m=+3.294402805,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.651897 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8334378ac15d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.593800541 +0000 UTC m=+3.300615289,LastTimestamp:2026-03-10 15:05:38.593800541 +0000 UTC m=+3.300615289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.655700 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8334379c1407 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.594935815 +0000 UTC m=+3.301750563,LastTimestamp:2026-03-10 15:05:38.594935815 +0000 UTC m=+3.301750563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.659297 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8334418fa2e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.761892578 +0000 UTC m=+3.468707336,LastTimestamp:2026-03-10 15:05:38.761892578 +0000 UTC m=+3.468707336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.664298 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833442ff0e3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.785971772 +0000 UTC m=+3.492786520,LastTimestamp:2026-03-10 15:05:38.785971772 +0000 UTC m=+3.492786520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.668365 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83344312074b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.787215179 +0000 UTC m=+3.494029927,LastTimestamp:2026-03-10 15:05:38.787215179 +0000 UTC m=+3.494029927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.672966 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b83344d0325d5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.954012117 +0000 UTC m=+3.660826865,LastTimestamp:2026-03-10 15:05:38.954012117 +0000 UTC m=+3.660826865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.676499 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83344db878f1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.965895409 +0000 UTC m=+3.672710157,LastTimestamp:2026-03-10 15:05:38.965895409 +0000 UTC m=+3.672710157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc 
kubenswrapper[4743]: E0310 15:06:10.679879 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83344eca58de openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.983844062 +0000 UTC m=+3.690658820,LastTimestamp:2026-03-10 15:05:38.983844062 +0000 UTC m=+3.690658820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.683533 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b83345d90aee3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:39.231723235 +0000 UTC m=+3.938537983,LastTimestamp:2026-03-10 15:05:39.231723235 +0000 UTC m=+3.938537983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.688550 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b83345f456a7a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:39.260344954 +0000 UTC m=+3.967159702,LastTimestamp:2026-03-10 15:05:39.260344954 +0000 UTC m=+3.967159702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.692273 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b83348987bbd2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:39.969334226 +0000 UTC m=+4.676148984,LastTimestamp:2026-03-10 15:05:39.969334226 +0000 UTC 
m=+4.676148984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.698690 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b83344312074b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83344312074b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.787215179 +0000 UTC m=+3.494029927,LastTimestamp:2026-03-10 15:05:39.983689452 +0000 UTC m=+4.690504200,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.710441 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b83344db878f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83344db878f1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.965895409 +0000 UTC m=+3.672710157,LastTimestamp:2026-03-10 15:05:40.155317103 +0000 UTC m=+4.862131851,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.715715 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833494ab9ad2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.15623445 +0000 UTC m=+4.863049208,LastTimestamp:2026-03-10 15:05:40.15623445 +0000 UTC m=+4.863049208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.719746 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b83344eca58de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b83344eca58de openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:38.983844062 +0000 UTC m=+3.690658820,LastTimestamp:2026-03-10 15:05:40.165298529 +0000 UTC m=+4.872113277,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.723637 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334953f9578 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.165932408 +0000 UTC m=+4.872747146,LastTimestamp:2026-03-10 15:05:40.165932408 +0000 UTC m=+4.872747146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.727834 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b83349555d3e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.167390181 +0000 UTC m=+4.874204929,LastTimestamp:2026-03-10 15:05:40.167390181 +0000 UTC m=+4.874204929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.731991 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b83349f5e77ab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.335728555 +0000 UTC m=+5.042543303,LastTimestamp:2026-03-10 15:05:40.335728555 +0000 UTC m=+5.042543303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.735423 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334a027d318 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.348924696 +0000 UTC m=+5.055739444,LastTimestamp:2026-03-10 15:05:40.348924696 +0000 UTC m=+5.055739444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.739769 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334a03e054c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.35037934 +0000 UTC m=+5.057194088,LastTimestamp:2026-03-10 15:05:40.35037934 +0000 UTC m=+5.057194088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.743591 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b8334aab9ba8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.526258827 +0000 UTC m=+5.233073615,LastTimestamp:2026-03-10 15:05:40.526258827 +0000 UTC m=+5.233073615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.747359 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334ab7fc618 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.539237912 +0000 UTC m=+5.246052670,LastTimestamp:2026-03-10 15:05:40.539237912 +0000 UTC m=+5.246052670,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.750492 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334ab9331c4 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.54051066 +0000 UTC m=+5.247325408,LastTimestamp:2026-03-10 15:05:40.54051066 +0000 UTC m=+5.247325408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.754509 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334b638518e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.719104398 +0000 UTC m=+5.425919146,LastTimestamp:2026-03-10 15:05:40.719104398 +0000 UTC m=+5.425919146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.758074 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b8334b6d923d6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.72964399 +0000 UTC m=+5.436458738,LastTimestamp:2026-03-10 15:05:40.72964399 +0000 UTC m=+5.436458738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.761794 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.761906 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334b6ed33a7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.730958759 +0000 UTC m=+5.437773507,LastTimestamp:2026-03-10 15:05:40.730958759 +0000 UTC 
m=+5.437773507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.765153 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334c3d87275 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.947702389 +0000 UTC m=+5.654517137,LastTimestamp:2026-03-10 15:05:40.947702389 +0000 UTC m=+5.654517137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.769202 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8334c47f4ff9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:40.958638073 +0000 UTC m=+5.665452821,LastTimestamp:2026-03-10 15:05:40.958638073 +0000 UTC m=+5.665452821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.774367 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.775883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.775915 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.775925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.775953 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.779781 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 15:06:10 crc kubenswrapper[4743]: &Event{ObjectMeta:{kube-apiserver-crc.189b8336f4f9b224 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 15:06:10 crc kubenswrapper[4743]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 15:06:10 crc kubenswrapper[4743]: Mar 10 15:06:10 crc kubenswrapper[4743]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:50.361899556 +0000 UTC m=+15.068714314,LastTimestamp:2026-03-10 15:05:50.361899556 +0000 UTC m=+15.068714314,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:06:10 crc kubenswrapper[4743]: > Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.779926 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.784888 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8336f4facb2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:50.361971498 +0000 UTC m=+15.068786256,LastTimestamp:2026-03-10 15:05:50.361971498 +0000 UTC m=+15.068786256,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.788833 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 15:06:10 crc 
kubenswrapper[4743]: &Event{ObjectMeta:{kube-apiserver-crc.189b8336f5372348 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 15:06:10 crc kubenswrapper[4743]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 10 15:06:10 crc kubenswrapper[4743]: Mar 10 15:06:10 crc kubenswrapper[4743]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:50.365926216 +0000 UTC m=+15.072741004,LastTimestamp:2026-03-10 15:05:50.365926216 +0000 UTC m=+15.072741004,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:06:10 crc kubenswrapper[4743]: > Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.792681 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b8336f4facb2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8336f4facb2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:50.361971498 +0000 UTC m=+15.068786256,LastTimestamp:2026-03-10 15:05:50.365990437 +0000 UTC m=+15.072805225,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.796860 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:06:10 crc kubenswrapper[4743]: &Event{ObjectMeta:{kube-controller-manager-crc.189b83373e1f2982 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 15:06:10 crc kubenswrapper[4743]: body: Mar 10 15:06:10 crc kubenswrapper[4743]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:51.589091714 +0000 UTC m=+16.295906462,LastTimestamp:2026-03-10 15:05:51.589091714 +0000 UTC m=+16.295906462,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:06:10 crc 
kubenswrapper[4743]: > Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.800399 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b83373e2044b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:51.589164216 +0000 UTC m=+16.295978964,LastTimestamp:2026-03-10 15:05:51.589164216 +0000 UTC m=+16.295978964,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.804977 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b83373e1f2982\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:06:10 crc kubenswrapper[4743]: &Event{ObjectMeta:{kube-controller-manager-crc.189b83373e1f2982 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 15:06:10 crc kubenswrapper[4743]: body: Mar 10 15:06:10 crc kubenswrapper[4743]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:51.589091714 +0000 UTC m=+16.295906462,LastTimestamp:2026-03-10 15:06:01.589225723 +0000 UTC m=+26.296040511,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:06:10 crc kubenswrapper[4743]: > Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.808256 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b83373e2044b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b83373e2044b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:51.589164216 +0000 UTC m=+16.295978964,LastTimestamp:2026-03-10 15:06:01.589302415 
+0000 UTC m=+26.296117203,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.811454 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:06:10 crc kubenswrapper[4743]: &Event{ObjectMeta:{kube-controller-manager-crc.189b833b33635e73 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": dial tcp 192.168.126.11:10357: connect: connection refused Mar 10 15:06:10 crc kubenswrapper[4743]: body: Mar 10 15:06:10 crc kubenswrapper[4743]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.588881523 +0000 UTC m=+33.295696271,LastTimestamp:2026-03-10 15:06:08.588881523 +0000 UTC m=+33.295696271,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:06:10 crc kubenswrapper[4743]: > Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.815993 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b33647779 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.588953465 +0000 UTC m=+33.295768213,LastTimestamp:2026-03-10 15:06:08.588953465 +0000 UTC m=+33.295768213,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.819737 4743 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b3397658f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.592291215 +0000 UTC m=+33.299105993,LastTimestamp:2026-03-10 15:06:08.592291215 +0000 UTC m=+33.299105993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.823138 4743 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189b8333e602a202\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333e602a202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.225925122 +0000 UTC m=+1.932739870,LastTimestamp:2026-03-10 15:06:08.607647673 +0000 UTC m=+33.314462451,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.826333 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b8333f9d28941\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333f9d28941 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 
15:05:37.558317377 +0000 UTC m=+2.265132125,LastTimestamp:2026-03-10 15:06:08.83656162 +0000 UTC m=+33.543376368,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: E0310 15:06:10.829349 4743 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b8333fac68b18\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8333fac68b18 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:05:37.574308632 +0000 UTC m=+2.281123400,LastTimestamp:2026-03-10 15:06:08.846221743 +0000 UTC m=+33.553036491,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.861167 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.914619 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.916179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.916223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.916236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:10 crc kubenswrapper[4743]: I0310 15:06:10.916829 4743 scope.go:117] "RemoveContainer" containerID="018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3" Mar 10 15:06:11 crc kubenswrapper[4743]: I0310 15:06:11.861737 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.091794 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.092897 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.094800 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6c59d4bb2eee32139f8aa6faa8f6e8a849789a617b7fb3409df369c4e24cac6c" exitCode=255 Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.094857 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6c59d4bb2eee32139f8aa6faa8f6e8a849789a617b7fb3409df369c4e24cac6c"} Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.094922 4743 scope.go:117] 
"RemoveContainer" containerID="018677ca2fb53e749b34ca7b48094501d4ab275249ad75df42850826fe6395a3" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.095073 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.096875 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.096924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.096942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.097922 4743 scope.go:117] "RemoveContainer" containerID="6c59d4bb2eee32139f8aa6faa8f6e8a849789a617b7fb3409df369c4e24cac6c" Mar 10 15:06:12 crc kubenswrapper[4743]: E0310 15:06:12.098251 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.728881 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.729123 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.730479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:12 crc 
kubenswrapper[4743]: I0310 15:06:12.730524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.730568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:12 crc kubenswrapper[4743]: I0310 15:06:12.861504 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:06:13 crc kubenswrapper[4743]: I0310 15:06:13.104046 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 15:06:13 crc kubenswrapper[4743]: I0310 15:06:13.860885 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:06:14 crc kubenswrapper[4743]: I0310 15:06:14.691043 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:06:14 crc kubenswrapper[4743]: I0310 15:06:14.714499 4743 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 15:06:14 crc kubenswrapper[4743]: I0310 15:06:14.864896 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:06:15 crc kubenswrapper[4743]: I0310 15:06:15.861872 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:16 crc kubenswrapper[4743]: E0310 15:06:16.010721 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 15:06:16 crc kubenswrapper[4743]: I0310 15:06:16.861284 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.417562 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.417758 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.418968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.418993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.419001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.419457 4743 scope.go:117] "RemoveContainer" containerID="6c59d4bb2eee32139f8aa6faa8f6e8a849789a617b7fb3409df369c4e24cac6c"
Mar 10 15:06:17 crc kubenswrapper[4743]: E0310 15:06:17.419603 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 15:06:17 crc kubenswrapper[4743]: E0310 15:06:17.768150 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.780316 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.782080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.782245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.782354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.782724 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:06:17 crc kubenswrapper[4743]: E0310 15:06:17.786349 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.858356 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:17 crc kubenswrapper[4743]: I0310 15:06:17.965489 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.120197 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.121347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.121422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.121435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.122104 4743 scope.go:117] "RemoveContainer" containerID="6c59d4bb2eee32139f8aa6faa8f6e8a849789a617b7fb3409df369c4e24cac6c"
Mar 10 15:06:18 crc kubenswrapper[4743]: E0310 15:06:18.122305 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.588460 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.588727 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.590212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.590243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.590251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:18 crc kubenswrapper[4743]: I0310 15:06:18.862532 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:19 crc kubenswrapper[4743]: I0310 15:06:19.184918 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:06:19 crc kubenswrapper[4743]: I0310 15:06:19.185086 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:19 crc kubenswrapper[4743]: I0310 15:06:19.188649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:19 crc kubenswrapper[4743]: I0310 15:06:19.188688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:19 crc kubenswrapper[4743]: I0310 15:06:19.188699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:19 crc kubenswrapper[4743]: I0310 15:06:19.189730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:06:19 crc kubenswrapper[4743]: I0310 15:06:19.860961 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:20 crc kubenswrapper[4743]: I0310 15:06:20.125566 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:20 crc kubenswrapper[4743]: I0310 15:06:20.126931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:20 crc kubenswrapper[4743]: I0310 15:06:20.127057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:20 crc kubenswrapper[4743]: I0310 15:06:20.127125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:20 crc kubenswrapper[4743]: I0310 15:06:20.862728 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:21 crc kubenswrapper[4743]: W0310 15:06:21.750362 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 10 15:06:21 crc kubenswrapper[4743]: E0310 15:06:21.750429 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 10 15:06:21 crc kubenswrapper[4743]: I0310 15:06:21.860284 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:22 crc kubenswrapper[4743]: I0310 15:06:22.864413 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:22 crc kubenswrapper[4743]: W0310 15:06:22.917333 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 10 15:06:22 crc kubenswrapper[4743]: E0310 15:06:22.917422 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 10 15:06:23 crc kubenswrapper[4743]: I0310 15:06:23.860910 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:24 crc kubenswrapper[4743]: E0310 15:06:24.772183 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 15:06:24 crc kubenswrapper[4743]: I0310 15:06:24.787284 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:24 crc kubenswrapper[4743]: I0310 15:06:24.788582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:24 crc kubenswrapper[4743]: I0310 15:06:24.788631 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:24 crc kubenswrapper[4743]: I0310 15:06:24.788643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:24 crc kubenswrapper[4743]: I0310 15:06:24.788669 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:06:24 crc kubenswrapper[4743]: E0310 15:06:24.793977 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 15:06:24 crc kubenswrapper[4743]: I0310 15:06:24.860434 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:25 crc kubenswrapper[4743]: W0310 15:06:25.314132 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:25 crc kubenswrapper[4743]: E0310 15:06:25.314218 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 15:06:25 crc kubenswrapper[4743]: W0310 15:06:25.722032 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 10 15:06:25 crc kubenswrapper[4743]: E0310 15:06:25.722113 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 15:06:25 crc kubenswrapper[4743]: I0310 15:06:25.862368 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:26 crc kubenswrapper[4743]: E0310 15:06:26.011007 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 15:06:26 crc kubenswrapper[4743]: I0310 15:06:26.305490 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 15:06:26 crc kubenswrapper[4743]: I0310 15:06:26.306289 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:26 crc kubenswrapper[4743]: I0310 15:06:26.308126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:26 crc kubenswrapper[4743]: I0310 15:06:26.308178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:26 crc kubenswrapper[4743]: I0310 15:06:26.308190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:26 crc kubenswrapper[4743]: I0310 15:06:26.861235 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:27 crc kubenswrapper[4743]: I0310 15:06:27.860646 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:28 crc kubenswrapper[4743]: I0310 15:06:28.860295 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:29 crc kubenswrapper[4743]: I0310 15:06:29.861889 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:30 crc kubenswrapper[4743]: I0310 15:06:30.861405 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:31 crc kubenswrapper[4743]: E0310 15:06:31.777194 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.795357 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.796495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.796526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.796538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.796562 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:06:31 crc kubenswrapper[4743]: E0310 15:06:31.801428 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.860269 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.914710 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.915742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.915788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.915806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:31 crc kubenswrapper[4743]: I0310 15:06:31.916493 4743 scope.go:117] "RemoveContainer" containerID="6c59d4bb2eee32139f8aa6faa8f6e8a849789a617b7fb3409df369c4e24cac6c"
Mar 10 15:06:32 crc kubenswrapper[4743]: I0310 15:06:32.161706 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 15:06:32 crc kubenswrapper[4743]: I0310 15:06:32.163490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba"}
Mar 10 15:06:32 crc kubenswrapper[4743]: I0310 15:06:32.163637 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:32 crc kubenswrapper[4743]: I0310 15:06:32.164457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:32 crc kubenswrapper[4743]: I0310 15:06:32.164509 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:32 crc kubenswrapper[4743]: I0310 15:06:32.164527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:32 crc kubenswrapper[4743]: I0310 15:06:32.861319 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.167797 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.168237 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.170381 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba" exitCode=255
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.170429 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba"}
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.170475 4743 scope.go:117] "RemoveContainer" containerID="6c59d4bb2eee32139f8aa6faa8f6e8a849789a617b7fb3409df369c4e24cac6c"
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.170681 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.171885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.171927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.171943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.172826 4743 scope.go:117] "RemoveContainer" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba"
Mar 10 15:06:33 crc kubenswrapper[4743]: E0310 15:06:33.173035 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 15:06:33 crc kubenswrapper[4743]: I0310 15:06:33.862119 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:34 crc kubenswrapper[4743]: I0310 15:06:34.176080 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 15:06:34 crc kubenswrapper[4743]: I0310 15:06:34.861636 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:35 crc kubenswrapper[4743]: I0310 15:06:35.860050 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:36 crc kubenswrapper[4743]: E0310 15:06:36.011218 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 15:06:36 crc kubenswrapper[4743]: I0310 15:06:36.861845 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:37 crc kubenswrapper[4743]: I0310 15:06:37.417684 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:06:37 crc kubenswrapper[4743]: I0310 15:06:37.417984 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:37 crc kubenswrapper[4743]: I0310 15:06:37.419747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:37 crc kubenswrapper[4743]: I0310 15:06:37.419825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:37 crc kubenswrapper[4743]: I0310 15:06:37.419839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:37 crc kubenswrapper[4743]: I0310 15:06:37.420650 4743 scope.go:117] "RemoveContainer" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba"
Mar 10 15:06:37 crc kubenswrapper[4743]: E0310 15:06:37.421022 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 15:06:37 crc kubenswrapper[4743]: I0310 15:06:37.860632 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:37 crc kubenswrapper[4743]: I0310 15:06:37.965318 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.189276 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.190609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.190697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.190726 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.192015 4743 scope.go:117] "RemoveContainer" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba"
Mar 10 15:06:38 crc kubenswrapper[4743]: E0310 15:06:38.192331 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 15:06:38 crc kubenswrapper[4743]: E0310 15:06:38.783492 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.801795 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.804471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.804524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.804535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.804564 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:06:38 crc kubenswrapper[4743]: E0310 15:06:38.816080 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 15:06:38 crc kubenswrapper[4743]: I0310 15:06:38.862609 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:39 crc kubenswrapper[4743]: I0310 15:06:39.860782 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:42 crc kubenswrapper[4743]: I0310 15:06:41.329578 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:42 crc kubenswrapper[4743]: I0310 15:06:42.052338 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 15:06:42 crc kubenswrapper[4743]: I0310 15:06:42.653559 4743 csr.go:261] certificate signing request csr-fdmrp is approved, waiting to be issued
Mar 10 15:06:42 crc kubenswrapper[4743]: I0310 15:06:42.660247 4743 csr.go:257] certificate signing request csr-fdmrp is issued
Mar 10 15:06:42 crc kubenswrapper[4743]: I0310 15:06:42.717289 4743 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 10 15:06:42 crc kubenswrapper[4743]: I0310 15:06:42.740284 4743 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 10 15:06:43 crc kubenswrapper[4743]: I0310 15:06:43.661315 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-18 07:39:07.568883587 +0000 UTC
Mar 10 15:06:43 crc kubenswrapper[4743]: I0310 15:06:43.661805 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6784h32m23.907086061s for next certificate rotation
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.816705 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.819104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.819156 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.819173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.819458 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.828913 4743 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.829338 4743 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 10 15:06:45 crc kubenswrapper[4743]: E0310 15:06:45.829372 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.833272 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.833300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.833313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.833336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.833352 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:45Z","lastTransitionTime":"2026-03-10T15:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 15:06:45 crc kubenswrapper[4743]: E0310 15:06:45.846173 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.855392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.855449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.855464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.855488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.855910 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:45Z","lastTransitionTime":"2026-03-10T15:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:45 crc kubenswrapper[4743]: E0310 15:06:45.869769 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.880400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.880444 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.880464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.880485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.880499 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:45Z","lastTransitionTime":"2026-03-10T15:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:45 crc kubenswrapper[4743]: E0310 15:06:45.891674 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.899372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.899401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.899411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.899427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:45 crc kubenswrapper[4743]: I0310 15:06:45.899438 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:45Z","lastTransitionTime":"2026-03-10T15:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:45 crc kubenswrapper[4743]: E0310 15:06:45.915465 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:45 crc kubenswrapper[4743]: E0310 15:06:45.915635 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:06:45 crc kubenswrapper[4743]: E0310 15:06:45.915720 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.012249 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.015777 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.116475 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.218599 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.319079 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.419895 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.520294 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.620878 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: 
E0310 15:06:46.721152 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.821829 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:46 crc kubenswrapper[4743]: E0310 15:06:46.922770 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.023225 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.123385 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.223672 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.324880 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.426041 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.526631 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.627927 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.729112 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.830222 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 10 15:06:47 crc kubenswrapper[4743]: E0310 15:06:47.930441 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.031477 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.132351 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.233491 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.333633 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.434185 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.534378 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.635406 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.735843 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.837321 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: E0310 15:06:48.938361 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:06:48 crc kubenswrapper[4743]: I0310 15:06:48.958672 4743 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.041320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.041379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.041399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.041423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.041436 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.143786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.143872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.143892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.143921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.143939 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.247688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.247867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.247890 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.247939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.248157 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.329982 4743 apiserver.go:52] "Watching apiserver" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.341456 4743 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.342195 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.343056 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.343170 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.343191 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.343218 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.343893 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.343894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.344013 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.347240 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.347328 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.349758 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.349953 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.349964 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.350982 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.351002 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.351118 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.351366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.351401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.351413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.351430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 
15:06:49.351444 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.351948 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.352113 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.352121 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.365705 4743 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366055 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366098 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366142 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366183 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366241 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366261 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366280 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366300 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366322 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366359 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366379 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366401 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366425 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366448 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 
15:06:49.366467 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366487 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366508 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366529 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366573 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366597 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366621 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366647 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366665 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 
15:06:49.366706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366726 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366745 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366767 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366790 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366847 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366868 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366886 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366905 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366921 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366938 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 15:06:49 
crc kubenswrapper[4743]: I0310 15:06:49.366960 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.366997 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367017 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367035 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367053 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367071 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367147 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 
10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367265 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367288 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367307 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367327 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367346 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367363 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367380 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367398 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367414 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367433 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367487 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367505 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 
15:06:49.367522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367539 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367576 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367594 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367610 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367626 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367656 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367690 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:06:49 crc 
kubenswrapper[4743]: I0310 15:06:49.367706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367741 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367756 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367776 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367792 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367821 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367855 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367871 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367887 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367903 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367921 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367940 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367957 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367973 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.367989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368006 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368022 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368058 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368088 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368169 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368188 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368220 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368236 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368252 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368267 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368284 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368299 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368318 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368340 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368350 4743 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368356 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368364 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368419 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368453 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368477 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368497 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368516 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368533 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368549 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368590 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368611 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368629 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368773 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368891 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368929 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368950 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368969 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.368990 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369010 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369055 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369091 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369128 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369143 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369145 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369173 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369232 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369249 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369271 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369295 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369313 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369350 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369373 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369392 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:06:49 
crc kubenswrapper[4743]: I0310 15:06:49.369411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369420 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369547 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369707 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369890 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.370003 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.370063 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.370337 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.370354 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.370672 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.370698 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.370854 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.371058 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.370864 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.371211 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.371543 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.371587 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.371642 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.371660 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.371752 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.371966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.372181 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.372242 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.372342 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.372447 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.372698 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.372723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.372827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373243 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373271 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373406 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373450 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373698 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373780 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.373848 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.374515 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.374558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.374590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.375133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.375148 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.375188 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.375442 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.375574 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.375693 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.375990 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.376057 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.376379 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.376611 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.376837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.376848 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.376958 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.377141 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.377592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.377620 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.378850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.379102 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.379300 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.379897 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.380030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.380098 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.380296 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.380622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.380776 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:06:49.880736067 +0000 UTC m=+74.587550855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.381019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.381656 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.382115 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.382457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.382871 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.383227 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.383702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.383746 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.383963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.384460 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.384488 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.384567 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.384725 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.384711 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.384897 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.384976 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.385219 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.385444 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.385481 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.385609 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.385648 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.385662 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.386003 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.386205 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.386322 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.386470 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.386577 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.386581 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.386875 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.386963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.387793 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.387827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.388249 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.388418 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.388509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.388482 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.388654 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389145 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.383263 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389195 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389417 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389624 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389587 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.389964 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.390032 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.390170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.390468 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.390491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.390525 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.390841 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.390839 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.390895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391137 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391331 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.369432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391597 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391644 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391701 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:06:49 crc 
kubenswrapper[4743]: I0310 15:06:49.391732 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391759 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391782 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391805 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391867 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391896 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391917 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391936 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391958 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391979 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.391999 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392024 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392052 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392077 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392141 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392165 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:06:49 crc kubenswrapper[4743]: 
I0310 15:06:49.392187 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392239 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392268 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392315 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392338 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392357 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392376 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392392 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392410 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 
15:06:49.392427 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392530 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392531 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392548 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392566 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392582 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392632 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392682 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393138 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393354 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393433 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393632 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393660 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393683 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393703 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393722 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393742 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393764 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393784 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393804 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393920 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394033 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394113 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394176 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394200 4743 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394222 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394248 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394271 4743 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394291 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394312 4743 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394332 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394352 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394371 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394395 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394417 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394439 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394459 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394478 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394495 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394514 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394532 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394555 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath 
\"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394572 4743 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394591 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394611 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394628 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394644 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394662 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394681 4743 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394699 4743 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394718 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394735 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394754 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394776 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394799 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394853 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394875 4743 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394894 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394913 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394931 4743 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394950 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394968 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.394989 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395005 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395022 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395040 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395063 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395079 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395095 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395115 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395134 4743 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395153 
4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395173 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395191 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395211 4743 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395231 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395248 4743 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.392966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393032 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395025 4743 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393331 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.393768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). 
InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.394175 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395568 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.395669 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.396060 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.396583 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.396367 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.396713 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.396944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.396951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.397050 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.397188 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.397260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.397598 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.397830 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:49.89769201 +0000 UTC m=+74.604506768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.397923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.398290 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.398323 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.398362 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.398610 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.398982 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.398992 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.399059 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:49.899042299 +0000 UTC m=+74.605857047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.399090 4743 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.399231 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.399518 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.399594 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.399801 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.401037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.401275 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.401565 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.401660 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.401922 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402679 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402874 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402907 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402924 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402941 4743 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402954 4743 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402969 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402984 4743 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.402998 4743 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403013 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403025 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403040 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403056 4743 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403069 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on 
node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403049 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403082 4743 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403183 4743 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403231 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403245 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403257 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403285 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403297 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403308 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403328 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403340 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403367 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403378 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403389 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403398 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403409 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on 
node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403420 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403446 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403456 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403467 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403477 4743 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403487 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403499 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc 
kubenswrapper[4743]: I0310 15:06:49.403524 4743 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403534 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403543 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403555 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403565 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403576 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403605 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403617 4743 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403628 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403638 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403647 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403672 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403680 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403691 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403701 4743 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403711 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403721 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403746 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403757 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403766 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403775 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403784 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403794 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403805 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403848 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403858 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403869 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403879 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403889 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.403899 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.408399 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.409079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.413932 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.418203 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.420053 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.420163 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.420234 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.420364 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:49.920339978 +0000 UTC m=+74.627154726 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.421361 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.421566 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.422055 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.422513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.423090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.423111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.423413 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.424487 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.424550 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.424828 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.426235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.428542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.429025 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.429050 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.429071 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.429146 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:06:49.929119743 +0000 UTC m=+74.635934701 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.429298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.429982 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.430636 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.430676 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.431010 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.431424 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.432245 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.432802 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.433159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.434026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.434512 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.436301 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.436920 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.437222 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.447178 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.449718 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.453304 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.454576 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.455361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.455415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.455427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.455446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.455460 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.460045 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.464852 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.481042 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.504919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505074 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505117 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505186 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505200 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505212 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505322 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") 
on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505338 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505357 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505369 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505381 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505392 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505405 4743 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505294 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505417 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505455 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505470 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505484 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505499 4743 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505511 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505524 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505538 4743 reconciler_common.go:293] 
"Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505551 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505563 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505575 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505590 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505606 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505621 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505634 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") 
on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505648 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505662 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505675 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505687 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505700 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505712 4743 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505725 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505742 4743 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505755 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505767 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505780 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505794 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505808 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505840 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505853 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505867 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505882 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505895 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505908 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.505921 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506009 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506095 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 
15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506151 4743 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506162 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506173 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506185 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506195 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506209 4743 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506224 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506235 4743 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506246 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506257 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506268 4743 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506279 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.506289 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.558671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.558732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.558749 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.558771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.558787 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.661003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.661036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.661045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.661060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.661069 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.663571 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.686829 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.687536 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.688616 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.700381 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:06:49 crc kubenswrapper[4743]: W0310 15:06:49.700676 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-c55956572b9012ae4b5a14d371244a60e43e4502239e3cab0e2c4bdabf2b5658 WatchSource:0}: Error finding container c55956572b9012ae4b5a14d371244a60e43e4502239e3cab0e2c4bdabf2b5658: Status 404 returned error can't find the container with id c55956572b9012ae4b5a14d371244a60e43e4502239e3cab0e2c4bdabf2b5658 Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.707232 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:06:49 crc kubenswrapper[4743]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 15:06:49 crc kubenswrapper[4743]: set -o allexport Mar 10 15:06:49 crc kubenswrapper[4743]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 15:06:49 crc kubenswrapper[4743]: source /etc/kubernetes/apiserver-url.env Mar 10 15:06:49 crc kubenswrapper[4743]: else Mar 10 15:06:49 crc kubenswrapper[4743]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 15:06:49 crc 
kubenswrapper[4743]: exit 1 Mar 10 15:06:49 crc kubenswrapper[4743]: fi Mar 10 15:06:49 crc kubenswrapper[4743]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 15:06:49 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar
{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:06:49 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.708626 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 
10 15:06:49 crc kubenswrapper[4743]: W0310 15:06:49.710895 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5257088b905e15fe2cfef63cc8120dc0245df3783eea3b2781d0a9e46871e5c4 WatchSource:0}: Error finding container 5257088b905e15fe2cfef63cc8120dc0245df3783eea3b2781d0a9e46871e5c4: Status 404 returned error can't find the container with id 5257088b905e15fe2cfef63cc8120dc0245df3783eea3b2781d0a9e46871e5c4 Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.713464 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:06:49 crc kubenswrapper[4743]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:06:49 crc kubenswrapper[4743]: if [[ -f "/env/_master" ]]; then Mar 10 15:06:49 crc kubenswrapper[4743]: set -o allexport Mar 10 15:06:49 crc kubenswrapper[4743]: source "/env/_master" Mar 10 15:06:49 crc kubenswrapper[4743]: set +o allexport Mar 10 15:06:49 crc kubenswrapper[4743]: fi Mar 10 15:06:49 crc kubenswrapper[4743]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 15:06:49 crc kubenswrapper[4743]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 15:06:49 crc kubenswrapper[4743]: ho_enable="--enable-hybrid-overlay" Mar 10 15:06:49 crc kubenswrapper[4743]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 15:06:49 crc kubenswrapper[4743]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 15:06:49 crc kubenswrapper[4743]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 15:06:49 crc kubenswrapper[4743]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:06:49 crc kubenswrapper[4743]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 15:06:49 crc kubenswrapper[4743]: --webhook-host=127.0.0.1 \ Mar 10 15:06:49 crc kubenswrapper[4743]: --webhook-port=9743 \ Mar 10 15:06:49 crc kubenswrapper[4743]: ${ho_enable} \ Mar 10 15:06:49 crc kubenswrapper[4743]: --enable-interconnect \ Mar 10 15:06:49 crc kubenswrapper[4743]: --disable-approver \ Mar 10 15:06:49 crc kubenswrapper[4743]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 15:06:49 crc kubenswrapper[4743]: --wait-for-kubernetes-api=200s \ Mar 10 15:06:49 crc kubenswrapper[4743]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 15:06:49 crc kubenswrapper[4743]: --loglevel="${LOGLEVEL}" Mar 10 15:06:49 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:06:49 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.716013 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:06:49 crc kubenswrapper[4743]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:06:49 crc 
kubenswrapper[4743]: if [[ -f "/env/_master" ]]; then Mar 10 15:06:49 crc kubenswrapper[4743]: set -o allexport Mar 10 15:06:49 crc kubenswrapper[4743]: source "/env/_master" Mar 10 15:06:49 crc kubenswrapper[4743]: set +o allexport Mar 10 15:06:49 crc kubenswrapper[4743]: fi Mar 10 15:06:49 crc kubenswrapper[4743]: Mar 10 15:06:49 crc kubenswrapper[4743]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 15:06:49 crc kubenswrapper[4743]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:06:49 crc kubenswrapper[4743]: --disable-webhook \ Mar 10 15:06:49 crc kubenswrapper[4743]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 15:06:49 crc kubenswrapper[4743]: --loglevel="${LOGLEVEL}" Mar 10 15:06:49 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:06:49 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.717217 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.763729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.763775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.763790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.763828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.763842 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.869561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.869603 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.869615 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.869635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.869662 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.910966 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.911050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.911087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.911152 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:06:50.911116241 +0000 UTC m=+75.617931029 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.911172 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.911197 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.911224 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:50.911210674 +0000 UTC m=+75.618025412 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.911239 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:06:50.911230004 +0000 UTC m=+75.618044752 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.920225 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.922412 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.927696 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.930237 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.930386 4743 scope.go:117] "RemoveContainer" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba" Mar 10 15:06:49 crc kubenswrapper[4743]: E0310 15:06:49.931680 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.931839 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.932773 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.934751 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.935711 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.938127 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.939038 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.940315 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.941262 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.941947 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.943364 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.944125 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.945941 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.947297 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.948062 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.949448 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.950332 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.951616 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.953207 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.953807 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.955344 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.955919 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.957649 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.959118 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.960968 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.962113 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.963907 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.964905 4743 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.965116 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.968576 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.971200 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.972309 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 15:06:49 
crc kubenswrapper[4743]: I0310 15:06:49.972378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.972408 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.972436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.972451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.972460 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:49Z","lastTransitionTime":"2026-03-10T15:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.974834 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.976033 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.976543 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.977626 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.978349 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.979228 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.979825 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.980805 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.981406 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.982269 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.982791 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.983640 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.984393 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.985254 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.985699 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.986582 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.987086 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.987648 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.988469 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 15:06:49 crc kubenswrapper[4743]: I0310 15:06:49.988931 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.012202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.012268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.012413 4743 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.012449 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.012465 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.012418 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.012545 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.012580 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.012524 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:51.012506788 +0000 UTC m=+75.719321536 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.012655 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:51.012635531 +0000 UTC m=+75.719450279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.074336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.074385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.074396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.074413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.074423 4743 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.177165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.177219 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.177233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.177251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.177264 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.280698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.280771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.280788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.280850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.280890 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.357703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5257088b905e15fe2cfef63cc8120dc0245df3783eea3b2781d0a9e46871e5c4"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.359579 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c55956572b9012ae4b5a14d371244a60e43e4502239e3cab0e2c4bdabf2b5658"} Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.360580 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:06:50 crc kubenswrapper[4743]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:06:50 crc kubenswrapper[4743]: if [[ -f "/env/_master" ]]; then Mar 10 15:06:50 crc kubenswrapper[4743]: set -o allexport Mar 10 15:06:50 crc kubenswrapper[4743]: source "/env/_master" Mar 10 15:06:50 crc kubenswrapper[4743]: set +o allexport Mar 10 15:06:50 crc kubenswrapper[4743]: fi Mar 10 15:06:50 crc kubenswrapper[4743]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 15:06:50 crc kubenswrapper[4743]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 15:06:50 crc kubenswrapper[4743]: ho_enable="--enable-hybrid-overlay" Mar 10 15:06:50 crc kubenswrapper[4743]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 15:06:50 crc kubenswrapper[4743]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 15:06:50 crc kubenswrapper[4743]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 15:06:50 crc kubenswrapper[4743]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:06:50 crc kubenswrapper[4743]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 15:06:50 crc kubenswrapper[4743]: --webhook-host=127.0.0.1 \ Mar 10 15:06:50 crc kubenswrapper[4743]: --webhook-port=9743 \ Mar 10 15:06:50 crc kubenswrapper[4743]: ${ho_enable} \ Mar 10 15:06:50 crc kubenswrapper[4743]: --enable-interconnect \ Mar 10 15:06:50 crc kubenswrapper[4743]: --disable-approver \ Mar 10 15:06:50 crc kubenswrapper[4743]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 15:06:50 crc kubenswrapper[4743]: --wait-for-kubernetes-api=200s \ Mar 10 15:06:50 crc kubenswrapper[4743]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 15:06:50 crc kubenswrapper[4743]: --loglevel="${LOGLEVEL}" Mar 10 15:06:50 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:06:50 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.361917 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:06:50 crc kubenswrapper[4743]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 
15:06:50 crc kubenswrapper[4743]: set -o allexport Mar 10 15:06:50 crc kubenswrapper[4743]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 15:06:50 crc kubenswrapper[4743]: source /etc/kubernetes/apiserver-url.env Mar 10 15:06:50 crc kubenswrapper[4743]: else Mar 10 15:06:50 crc kubenswrapper[4743]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 15:06:50 crc kubenswrapper[4743]: exit 1 Mar 10 15:06:50 crc kubenswrapper[4743]: fi Mar 10 15:06:50 crc kubenswrapper[4743]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 15:06:50 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:06:50 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 
15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.361983 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8d08a2c9d33628f56bc7373f94510d28b1ab0af6efedc26b37e403a695c497e9"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.363027 4743 scope.go:117] "RemoveContainer" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.363061 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.363266 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.364005 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:06:50 crc kubenswrapper[4743]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:06:50 crc kubenswrapper[4743]: if [[ -f "/env/_master" ]]; then Mar 10 15:06:50 crc kubenswrapper[4743]: set -o allexport Mar 10 15:06:50 crc kubenswrapper[4743]: source "/env/_master" Mar 10 15:06:50 crc kubenswrapper[4743]: set +o allexport Mar 10 15:06:50 crc 
kubenswrapper[4743]: fi Mar 10 15:06:50 crc kubenswrapper[4743]: Mar 10 15:06:50 crc kubenswrapper[4743]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 15:06:50 crc kubenswrapper[4743]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:06:50 crc kubenswrapper[4743]: --disable-webhook \ Mar 10 15:06:50 crc kubenswrapper[4743]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 15:06:50 crc kubenswrapper[4743]: --loglevel="${LOGLEVEL}" Mar 10 15:06:50 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Conta
inerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:06:50 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.365165 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.365866 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.366993 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.373195 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.383665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.383718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.383732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.383748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.383757 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.384328 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.400333 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.409784 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.419646 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.430016 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.439906 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.450096 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.459463 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.470104 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.479701 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.486598 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.486630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.486640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.486657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.486668 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.492439 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.503930 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.512922 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.589527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc 
kubenswrapper[4743]: I0310 15:06:50.589596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.589621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.589655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.589682 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.691686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.691727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.691737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.691752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.691761 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.794792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.794871 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.794888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.794911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.794928 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.898166 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.898262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.898279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.898303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.898321 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:50Z","lastTransitionTime":"2026-03-10T15:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.914469 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.914485 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.914570 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.914858 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.914993 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.915123 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.919909 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.919987 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:50 crc kubenswrapper[4743]: I0310 15:06:50.920025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.920109 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.920152 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:06:52.920137605 +0000 UTC m=+77.626952363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.920220 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.920454 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:06:52.920218038 +0000 UTC m=+77.627032816 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:06:50 crc kubenswrapper[4743]: E0310 15:06:50.920547 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:52.920528517 +0000 UTC m=+77.627343305 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.001128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.001215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.001240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.001274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.001420 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.020530 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.020581 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:51 crc kubenswrapper[4743]: E0310 15:06:51.020692 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:51 crc kubenswrapper[4743]: E0310 15:06:51.020706 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:51 crc kubenswrapper[4743]: E0310 15:06:51.020716 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:51 crc kubenswrapper[4743]: E0310 15:06:51.020763 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:53.020749819 +0000 UTC m=+77.727564567 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:51 crc kubenswrapper[4743]: E0310 15:06:51.020930 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:51 crc kubenswrapper[4743]: E0310 15:06:51.020981 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:51 crc kubenswrapper[4743]: E0310 15:06:51.021008 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:51 crc kubenswrapper[4743]: E0310 15:06:51.021994 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:53.021971955 +0000 UTC m=+77.728786713 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.104149 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.104200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.104215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.104234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.104252 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.206484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.206518 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.206529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.206544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.206556 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.309536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.309617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.309649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.309679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.309699 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.413317 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.413441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.413516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.413552 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.413628 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.515975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.516046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.516068 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.516096 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.516117 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.618793 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.618890 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.618908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.618933 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.618952 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.722958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.723033 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.723056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.723085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.723163 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.825713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.825793 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.825839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.825867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.825931 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.928249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.928297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.928312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.928332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:51 crc kubenswrapper[4743]: I0310 15:06:51.928348 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:51Z","lastTransitionTime":"2026-03-10T15:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.031136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.031175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.031184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.031198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.031211 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.133978 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.134132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.134159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.134188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.134214 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.236129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.236179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.236197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.236213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.236221 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.338205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.338239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.338248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.338263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.338315 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.441086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.441134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.441146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.441164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.441176 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.543958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.544000 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.544010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.544026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.544038 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.647443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.647496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.647511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.647535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.647547 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.749902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.749959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.749970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.749986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.749995 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.851777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.851849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.851862 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.851878 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.851886 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.914358 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.914425 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:52 crc kubenswrapper[4743]: E0310 15:06:52.914504 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.914431 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:52 crc kubenswrapper[4743]: E0310 15:06:52.914607 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:06:52 crc kubenswrapper[4743]: E0310 15:06:52.914732 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.936869 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.936955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.936991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:52 crc kubenswrapper[4743]: E0310 15:06:52.937073 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:52 crc kubenswrapper[4743]: E0310 15:06:52.937083 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:06:56.9370495 +0000 UTC m=+81.643864248 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:06:52 crc kubenswrapper[4743]: E0310 15:06:52.937131 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:56.937117862 +0000 UTC m=+81.643932610 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:52 crc kubenswrapper[4743]: E0310 15:06:52.937158 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:52 crc kubenswrapper[4743]: E0310 15:06:52.937487 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:56.937469332 +0000 UTC m=+81.644284160 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.954470 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.954515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.954525 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.954540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:52 crc kubenswrapper[4743]: I0310 15:06:52.954552 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:52Z","lastTransitionTime":"2026-03-10T15:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.037583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.037638 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:53 crc kubenswrapper[4743]: E0310 15:06:53.037755 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:53 crc kubenswrapper[4743]: E0310 15:06:53.037770 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:53 crc kubenswrapper[4743]: E0310 15:06:53.037781 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:53 crc kubenswrapper[4743]: E0310 15:06:53.037844 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:53 crc 
kubenswrapper[4743]: E0310 15:06:53.037890 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:53 crc kubenswrapper[4743]: E0310 15:06:53.037905 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:53 crc kubenswrapper[4743]: E0310 15:06:53.037852 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:57.037838899 +0000 UTC m=+81.744653647 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:53 crc kubenswrapper[4743]: E0310 15:06:53.037991 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:06:57.037971663 +0000 UTC m=+81.744786491 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.056422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.056474 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.056485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.056501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.056511 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.158119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.158163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.158172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.158187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.158198 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.260059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.260096 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.260106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.260138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.260148 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.362882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.362924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.362933 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.362948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.362956 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.465016 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.465059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.465069 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.465084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.465093 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.568781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.568867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.568878 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.568894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.568904 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.672317 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.672396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.672415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.672446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.672468 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.775796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.775934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.775960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.775993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.776013 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.879740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.879780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.879790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.879805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.879837 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.983526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.983629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.983647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.983668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:53 crc kubenswrapper[4743]: I0310 15:06:53.983680 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:53Z","lastTransitionTime":"2026-03-10T15:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.086664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.086720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.086737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.086761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.086781 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.190200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.190251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.190262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.190278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.190288 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.292997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.293036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.293044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.293063 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.293073 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.395537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.395573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.395582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.395595 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.395605 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.497776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.497908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.497929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.497959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.497975 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.601502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.601632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.601661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.601687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.601707 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.703902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.703940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.703950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.703964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.703973 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.806242 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.806305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.806322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.806352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.806369 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.908754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.908826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.908842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.908858 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.908868 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:54Z","lastTransitionTime":"2026-03-10T15:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.915183 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.915182 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:54 crc kubenswrapper[4743]: E0310 15:06:54.915276 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:06:54 crc kubenswrapper[4743]: I0310 15:06:54.915333 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:54 crc kubenswrapper[4743]: E0310 15:06:54.915389 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:06:54 crc kubenswrapper[4743]: E0310 15:06:54.915529 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.010844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.010882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.010893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.010912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.010923 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.113935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.113968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.113978 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.113994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.114003 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.215920 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.215963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.215972 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.215985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.215994 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.318043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.318089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.318099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.318113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.318127 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.420001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.420047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.420059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.420076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.420086 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.522496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.522529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.522539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.522554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.522565 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.625086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.625128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.625142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.625159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.625170 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.727444 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.727487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.727500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.727519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.727532 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.830217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.830284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.830304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.830325 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.830340 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.926019 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.932345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.932386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.932395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.932410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.932424 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:55Z","lastTransitionTime":"2026-03-10T15:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.937443 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.949796 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.959268 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.969594 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.981001 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:55 crc kubenswrapper[4743]: I0310 15:06:55.993148 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.036196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.036248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.036260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.036279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.036293 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.139232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.139278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.139290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.139306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.139318 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.217790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.217848 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.217859 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.217877 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.217890 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.227884 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.238231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.238272 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.238282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.238298 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.238308 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.247259 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.250485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.250522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.250535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.250551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.250562 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.270551 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.273303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.273333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.273343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.273356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.273366 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.281202 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.281310 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.282625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.282653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.282662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.282676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.282685 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.385379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.385445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.385468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.385497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.385518 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.488508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.488596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.488605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.488618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.488626 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.591070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.591117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.591133 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.591153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.591170 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.694005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.694034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.694044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.694061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.694073 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.796636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.796693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.796709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.796732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.796749 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.899449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.899496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.899511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.899531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.899547 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:56Z","lastTransitionTime":"2026-03-10T15:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.914918 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.914919 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.915061 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.915227 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.915390 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.915512 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.973586 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.973664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.973797 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:07:04.973755944 +0000 UTC m=+89.680570742 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.973875 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.973975 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:04.97396102 +0000 UTC m=+89.680775778 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:06:56 crc kubenswrapper[4743]: I0310 15:06:56.974019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.974226 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:56 crc kubenswrapper[4743]: E0310 15:06:56.974323 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:04.97430273 +0000 UTC m=+89.681117518 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.002944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.003031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.003066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.003096 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.003118 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.074684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.074746 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:57 crc kubenswrapper[4743]: E0310 15:06:57.074913 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:57 crc kubenswrapper[4743]: E0310 15:06:57.074937 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:57 crc kubenswrapper[4743]: E0310 15:06:57.074951 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:57 crc kubenswrapper[4743]: E0310 15:06:57.075011 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:05.074994846 +0000 UTC m=+89.781809604 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:57 crc kubenswrapper[4743]: E0310 15:06:57.075028 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:06:57 crc kubenswrapper[4743]: E0310 15:06:57.075078 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:06:57 crc kubenswrapper[4743]: E0310 15:06:57.075100 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:57 crc kubenswrapper[4743]: E0310 15:06:57.075215 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:05.075185362 +0000 UTC m=+89.782000150 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.105562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.105656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.105695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.105730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.105755 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.208651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.208710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.208729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.208747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.208757 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.313614 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.313653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.313662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.313677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.313688 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.415671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.415723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.415735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.415755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.415767 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.517733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.517799 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.517847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.517868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.517882 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.619662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.619720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.619737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.619759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.619773 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.733573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.733629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.733643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.733661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.733680 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.836046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.836129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.836144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.836167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.836183 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.938295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.938372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.938387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.938433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:57 crc kubenswrapper[4743]: I0310 15:06:57.938448 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:57Z","lastTransitionTime":"2026-03-10T15:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.040501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.040554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.040570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.040592 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.040610 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.142859 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.142924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.142950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.142983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.143007 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.249369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.249427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.249440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.249458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.249475 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.353166 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.353230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.353251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.353298 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.353330 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.456252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.456326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.456344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.456370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.456389 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.558776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.558961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.558977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.558994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.559005 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.661551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.661837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.661876 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.661907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.661963 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.764225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.764262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.764270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.764284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.764293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.867007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.867066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.867105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.867142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.867164 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.915277 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.915277 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:06:58 crc kubenswrapper[4743]: E0310 15:06:58.915501 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.915283 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:06:58 crc kubenswrapper[4743]: E0310 15:06:58.915669 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:06:58 crc kubenswrapper[4743]: E0310 15:06:58.915871 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.970022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.970063 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.970076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.970093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:58 crc kubenswrapper[4743]: I0310 15:06:58.970104 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:58Z","lastTransitionTime":"2026-03-10T15:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.073027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.073147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.073161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.073176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.073187 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.175935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.176044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.176086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.176111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.176124 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.278964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.279015 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.279027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.279046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.279059 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.381654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.381710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.381721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.381738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.381752 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.484590 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.484658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.484677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.484701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.484717 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.587084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.587165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.587181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.587203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.587219 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.690136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.690194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.690210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.690229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.690242 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.792771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.792885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.792899 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.792926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.792941 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.895701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.895800 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.895844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.895869 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.895882 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.998206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.998269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.998281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.998297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:06:59 crc kubenswrapper[4743]: I0310 15:06:59.998308 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:06:59Z","lastTransitionTime":"2026-03-10T15:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.100996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.101056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.101073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.101098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.101116 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.203931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.203968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.203976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.203990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.203999 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.305710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.305750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.305758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.305772 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.305781 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.407895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.407931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.407942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.407957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.407968 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.509682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.509722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.509731 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.509745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.509755 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.612082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.612137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.612155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.612177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.612193 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.714512 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.714555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.714568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.714585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.714597 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.817321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.817374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.817390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.817411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.817426 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.915167 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.915247 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:00 crc kubenswrapper[4743]: E0310 15:07:00.915487 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.915555 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:00 crc kubenswrapper[4743]: E0310 15:07:00.915736 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:00 crc kubenswrapper[4743]: E0310 15:07:00.915903 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:00 crc kubenswrapper[4743]: E0310 15:07:00.918556 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:00 crc kubenswrapper[4743]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:07:00 crc kubenswrapper[4743]: if [[ -f "/env/_master" ]]; then Mar 10 15:07:00 crc kubenswrapper[4743]: set -o allexport Mar 10 15:07:00 crc kubenswrapper[4743]: source "/env/_master" Mar 10 15:07:00 crc kubenswrapper[4743]: set +o allexport Mar 10 15:07:00 crc kubenswrapper[4743]: fi Mar 10 15:07:00 crc kubenswrapper[4743]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 15:07:00 crc kubenswrapper[4743]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 15:07:00 crc kubenswrapper[4743]: ho_enable="--enable-hybrid-overlay" Mar 10 15:07:00 crc kubenswrapper[4743]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 15:07:00 crc kubenswrapper[4743]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 15:07:00 crc kubenswrapper[4743]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 15:07:00 crc kubenswrapper[4743]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:07:00 crc kubenswrapper[4743]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 15:07:00 crc kubenswrapper[4743]: --webhook-host=127.0.0.1 \ Mar 10 15:07:00 crc kubenswrapper[4743]: --webhook-port=9743 \ Mar 10 15:07:00 crc kubenswrapper[4743]: ${ho_enable} \ Mar 10 15:07:00 crc kubenswrapper[4743]: --enable-interconnect \ Mar 10 15:07:00 crc kubenswrapper[4743]: 
--disable-approver \ Mar 10 15:07:00 crc kubenswrapper[4743]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 15:07:00 crc kubenswrapper[4743]: --wait-for-kubernetes-api=200s \ Mar 10 15:07:00 crc kubenswrapper[4743]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 15:07:00 crc kubenswrapper[4743]: --loglevel="${LOGLEVEL}" Mar 10 15:07:00 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:n
il,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:00 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:00 crc kubenswrapper[4743]: E0310 15:07:00.918695 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:00 crc kubenswrapper[4743]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 15:07:00 crc kubenswrapper[4743]: set -o allexport Mar 10 15:07:00 crc kubenswrapper[4743]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 15:07:00 crc kubenswrapper[4743]: source /etc/kubernetes/apiserver-url.env Mar 10 15:07:00 crc kubenswrapper[4743]: else Mar 10 15:07:00 crc kubenswrapper[4743]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 15:07:00 crc kubenswrapper[4743]: exit 1 Mar 10 15:07:00 crc kubenswrapper[4743]: fi Mar 10 15:07:00 crc kubenswrapper[4743]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 15:07:00 crc kubenswrapper[4743]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:00 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.919523 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.919540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.919550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.919567 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:00 crc kubenswrapper[4743]: I0310 15:07:00.919580 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:00Z","lastTransitionTime":"2026-03-10T15:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:00 crc kubenswrapper[4743]: E0310 15:07:00.919797 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 15:07:00 crc kubenswrapper[4743]: E0310 15:07:00.921074 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:00 crc kubenswrapper[4743]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:07:00 crc kubenswrapper[4743]: if [[ -f "/env/_master" ]]; then Mar 10 15:07:00 crc kubenswrapper[4743]: set -o allexport Mar 10 15:07:00 crc kubenswrapper[4743]: source "/env/_master" Mar 10 15:07:00 crc kubenswrapper[4743]: set +o allexport Mar 10 15:07:00 crc kubenswrapper[4743]: fi Mar 10 15:07:00 crc kubenswrapper[4743]: Mar 10 15:07:00 crc kubenswrapper[4743]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 15:07:00 crc kubenswrapper[4743]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:07:00 crc kubenswrapper[4743]: --disable-webhook \ Mar 10 15:07:00 crc kubenswrapper[4743]: 
--csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 15:07:00 crc kubenswrapper[4743]: --loglevel="${LOGLEVEL}" Mar 10 15:07:00 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:00 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:00 crc kubenswrapper[4743]: E0310 15:07:00.922290 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.022264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.022306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.022316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.022331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.022342 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.125573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.125618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.125627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.125643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.125654 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.227967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.228032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.228047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.228067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.228079 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.331301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.331355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.331367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.331402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.331415 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.353948 4743 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.433838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.433884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.433900 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.433917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.433928 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.536643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.536693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.536702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.536717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.536728 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.639790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.639892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.639904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.639925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.639940 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.743286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.743334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.743345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.743362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.743375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.845663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.845708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.845721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.845739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.845752 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.916195 4743 scope.go:117] "RemoveContainer" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba" Mar 10 15:07:01 crc kubenswrapper[4743]: E0310 15:07:01.916579 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.948138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.948182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.948193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.948212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:01 crc kubenswrapper[4743]: I0310 15:07:01.948226 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:01Z","lastTransitionTime":"2026-03-10T15:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.053140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.053188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.053197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.053215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.053227 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.156210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.156267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.156278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.156296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.156307 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.258561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.258612 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.258633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.258659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.258677 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.360982 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.361023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.361030 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.361044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.361053 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.462756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.462835 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.462853 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.462886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.462908 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.564538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.564594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.564603 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.564621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.564633 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.666941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.666987 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.666998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.667012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.667022 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.768925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.768997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.769009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.769024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.769035 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.871083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.871130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.871142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.871159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.871173 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.914675 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.914706 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.914802 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:02 crc kubenswrapper[4743]: E0310 15:07:02.914925 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:02 crc kubenswrapper[4743]: E0310 15:07:02.915085 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:02 crc kubenswrapper[4743]: E0310 15:07:02.915169 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.973558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.973632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.973659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.973684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:02 crc kubenswrapper[4743]: I0310 15:07:02.973701 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:02Z","lastTransitionTime":"2026-03-10T15:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.040121 4743 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.076252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.076304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.076322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.076348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.076366 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.178339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.178424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.178440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.178459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.178473 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.281164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.281206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.281216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.281264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.281283 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.383787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.383847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.383860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.383884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.383896 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.486140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.486173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.486181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.486195 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.486204 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.588606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.588647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.588656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.588672 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.588682 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.691228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.691274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.691283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.691300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.691309 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.794319 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.794420 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.794432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.794447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.794457 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.897961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.898002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.898011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.898025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:03 crc kubenswrapper[4743]: I0310 15:07:03.898037 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:03Z","lastTransitionTime":"2026-03-10T15:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.000404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.000441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.000450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.000465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.000480 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.103212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.103243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.103251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.103265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.103275 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.206418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.206458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.206476 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.206497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.206513 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.309587 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.309632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.309642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.309658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.309667 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.411489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.411551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.411570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.411594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.411612 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.514837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.514882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.514903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.514921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.514930 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.617459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.617513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.617524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.617549 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.617562 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.734338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.734422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.734442 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.734465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.734483 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.836684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.836718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.836727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.836740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.836749 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.915087 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:04 crc kubenswrapper[4743]: E0310 15:07:04.915453 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.916080 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.916149 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:04 crc kubenswrapper[4743]: E0310 15:07:04.916259 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:04 crc kubenswrapper[4743]: E0310 15:07:04.916645 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:04 crc kubenswrapper[4743]: E0310 15:07:04.918534 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,
VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:04 crc kubenswrapper[4743]: E0310 15:07:04.919948 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.941540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.941593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.941609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.941631 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:04 crc kubenswrapper[4743]: I0310 15:07:04.941649 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:04Z","lastTransitionTime":"2026-03-10T15:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.043438 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.043644 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.043717 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:07:21.043678881 +0000 UTC m=+105.750493639 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.043800 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.043877 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.043953 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:21.043912268 +0000 UTC m=+105.750727056 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.044022 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.044077 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:21.044063042 +0000 UTC m=+105.750878010 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.046205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.046283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.046309 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.046390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.046457 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.145213 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.145273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.145406 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.145423 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.145434 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.145480 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:21.145464809 +0000 UTC m=+105.852279557 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.145510 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.145575 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.145596 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:05 crc kubenswrapper[4743]: E0310 15:07:05.145689 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:21.145662235 +0000 UTC m=+105.852476993 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.149746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.149777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.149789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.149839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.149852 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.252226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.252256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.252266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.252281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.252293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.355578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.355629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.355641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.355662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.355679 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.458924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.458970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.458980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.459004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.459018 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.562597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.562640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.562650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.562668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.562678 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.665868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.665951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.665970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.665999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.666020 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.769623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.769684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.769701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.769721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.769737 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.872556 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.872595 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.872603 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.872618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.872628 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.927854 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.941284 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.952430 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.961325 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.971850 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.974692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.974736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.974746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.974762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.974773 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:05Z","lastTransitionTime":"2026-03-10T15:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.987432 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:05 crc kubenswrapper[4743]: I0310 15:07:05.996619 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.076587 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.076635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.076647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.076665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.076678 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.180088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.180143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.180155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.180173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.180186 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.283150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.283233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.283247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.283273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.283292 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.386523 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.386581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.386591 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.386609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.386620 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.489320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.489364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.489374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.489397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.489412 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.591860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.591894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.591904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.591922 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.591934 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.598430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.598489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.598509 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.598532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.598551 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.615718 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.623903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.623977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.624005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.624032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.624053 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.642713 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.647089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.647152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.647170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.647193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.647213 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.660195 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.664412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.664472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.664637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.664692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.664717 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.679490 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.684404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.684453 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.684469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.684496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.684513 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.696152 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.696329 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.698666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.698717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.698736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.698761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.698779 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.802695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.802777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.802796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.802855 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.802875 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.906127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.906192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.906215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.906249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.906272 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:06Z","lastTransitionTime":"2026-03-10T15:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.914587 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.914617 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:06 crc kubenswrapper[4743]: I0310 15:07:06.914590 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.914782 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.914901 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:06 crc kubenswrapper[4743]: E0310 15:07:06.915008 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.009261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.009316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.009330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.009353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.009367 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.112367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.112405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.112416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.112433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.112445 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.214721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.214764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.214781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.214805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.214854 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.318485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.318532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.318550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.318574 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.318595 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.421771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.421871 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.421892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.421916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.421932 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.525183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.525239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.525252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.525275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.525287 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.629190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.629246 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.629258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.629282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.629298 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.731361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.731413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.731422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.731439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.731448 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.833845 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.833935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.833950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.833969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.834019 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.936543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.936613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.936635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.936670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:07 crc kubenswrapper[4743]: I0310 15:07:07.936692 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:07Z","lastTransitionTime":"2026-03-10T15:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.039647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.039696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.039705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.039727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.039737 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.141752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.141792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.141802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.141838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.141851 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.244791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.244865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.244876 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.244897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.244909 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.347729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.347791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.347808 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.347873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.347889 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.451223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.451290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.451303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.451321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.451335 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.553972 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.554035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.554045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.554064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.554077 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.657145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.657224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.657245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.657274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.657294 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.760018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.760085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.760094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.760110 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.760139 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.864232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.864330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.864346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.864370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.864391 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.914655 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.914730 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:08 crc kubenswrapper[4743]: E0310 15:07:08.914928 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.914992 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:08 crc kubenswrapper[4743]: E0310 15:07:08.915063 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:08 crc kubenswrapper[4743]: E0310 15:07:08.915182 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.926366 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.967900 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.967946 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.967957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.967979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:08 crc kubenswrapper[4743]: I0310 15:07:08.967996 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:08Z","lastTransitionTime":"2026-03-10T15:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.071699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.071762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.071775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.071798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.071828 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.174909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.174950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.174962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.174978 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.174990 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.278117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.278159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.278170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.278207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.278221 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.381298 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.381354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.381362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.381376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.381385 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.484704 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.484762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.484780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.484835 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.484863 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.587594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.587638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.587651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.587675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.587692 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.691142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.691203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.691217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.691239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.691253 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.793731 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.793787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.793799 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.793840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.793854 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.897197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.897243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.897255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.897273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:09 crc kubenswrapper[4743]: I0310 15:07:09.897284 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:09Z","lastTransitionTime":"2026-03-10T15:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.000673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.000729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.000739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.000760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.000771 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.103378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.103434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.103448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.103467 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.103478 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.206983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.207055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.207076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.207106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.207126 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.310206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.310268 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.310290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.310313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.310328 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.413395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.413470 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.413490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.413516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.413533 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.516785 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.516849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.516859 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.516875 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.516884 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.561632 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t9mrg"] Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.562052 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-t9mrg" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.564875 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.566093 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.566192 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.576903 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.589019 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.602474 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff
38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.617037 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.620504 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.620571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc 
kubenswrapper[4743]: I0310 15:07:10.620581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.620597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.620606 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.630399 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.644989 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.657453 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.664829 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.674250 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.701725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d36817c-a2b7-49c1-92a9-2f9c54fd4f97-hosts-file\") pod \"node-resolver-t9mrg\" (UID: \"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\") " pod="openshift-dns/node-resolver-t9mrg" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.701789 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjc2x\" (UniqueName: \"kubernetes.io/projected/5d36817c-a2b7-49c1-92a9-2f9c54fd4f97-kube-api-access-qjc2x\") pod \"node-resolver-t9mrg\" (UID: \"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\") " pod="openshift-dns/node-resolver-t9mrg" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.723117 4743 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.723188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.723212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.723238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.723259 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.802346 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjc2x\" (UniqueName: \"kubernetes.io/projected/5d36817c-a2b7-49c1-92a9-2f9c54fd4f97-kube-api-access-qjc2x\") pod \"node-resolver-t9mrg\" (UID: \"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\") " pod="openshift-dns/node-resolver-t9mrg" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.802409 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d36817c-a2b7-49c1-92a9-2f9c54fd4f97-hosts-file\") pod \"node-resolver-t9mrg\" (UID: \"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\") " pod="openshift-dns/node-resolver-t9mrg" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.802477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d36817c-a2b7-49c1-92a9-2f9c54fd4f97-hosts-file\") pod \"node-resolver-t9mrg\" (UID: \"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\") " pod="openshift-dns/node-resolver-t9mrg" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.820479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjc2x\" (UniqueName: \"kubernetes.io/projected/5d36817c-a2b7-49c1-92a9-2f9c54fd4f97-kube-api-access-qjc2x\") pod \"node-resolver-t9mrg\" (UID: \"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\") " pod="openshift-dns/node-resolver-t9mrg" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.825504 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.826127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.826146 4743 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.826165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.826177 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.882302 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t9mrg" Mar 10 15:07:10 crc kubenswrapper[4743]: W0310 15:07:10.896029 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d36817c_a2b7_49c1_92a9_2f9c54fd4f97.slice/crio-54431130abdcf743573ec00161a0045dda0377db2d925d01db36a1c16a252fdc WatchSource:0}: Error finding container 54431130abdcf743573ec00161a0045dda0377db2d925d01db36a1c16a252fdc: Status 404 returned error can't find the container with id 54431130abdcf743573ec00161a0045dda0377db2d925d01db36a1c16a252fdc Mar 10 15:07:10 crc kubenswrapper[4743]: E0310 15:07:10.898284 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:10 crc kubenswrapper[4743]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 15:07:10 crc kubenswrapper[4743]: set -uo pipefail Mar 10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 
10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 15:07:10 crc kubenswrapper[4743]: HOSTS_FILE="/etc/hosts" Mar 10 15:07:10 crc kubenswrapper[4743]: TEMP_FILE="/etc/hosts.tmp" Mar 10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: # Make a temporary file with the old hosts file's attributes. Mar 10 15:07:10 crc kubenswrapper[4743]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 15:07:10 crc kubenswrapper[4743]: echo "Failed to preserve hosts file. Exiting." Mar 10 15:07:10 crc kubenswrapper[4743]: exit 1 Mar 10 15:07:10 crc kubenswrapper[4743]: fi Mar 10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: while true; do Mar 10 15:07:10 crc kubenswrapper[4743]: declare -A svc_ips Mar 10 15:07:10 crc kubenswrapper[4743]: for svc in "${services[@]}"; do Mar 10 15:07:10 crc kubenswrapper[4743]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 15:07:10 crc kubenswrapper[4743]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 10 15:07:10 crc kubenswrapper[4743]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 15:07:10 crc kubenswrapper[4743]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 10 15:07:10 crc kubenswrapper[4743]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 15:07:10 crc kubenswrapper[4743]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 15:07:10 crc kubenswrapper[4743]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 15:07:10 crc kubenswrapper[4743]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 15:07:10 crc kubenswrapper[4743]: for i in ${!cmds[*]} Mar 10 15:07:10 crc kubenswrapper[4743]: do Mar 10 15:07:10 crc kubenswrapper[4743]: ips=($(eval "${cmds[i]}")) Mar 10 15:07:10 crc kubenswrapper[4743]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 15:07:10 crc kubenswrapper[4743]: svc_ips["${svc}"]="${ips[@]}" Mar 10 15:07:10 crc kubenswrapper[4743]: break Mar 10 15:07:10 crc kubenswrapper[4743]: fi Mar 10 15:07:10 crc kubenswrapper[4743]: done Mar 10 15:07:10 crc kubenswrapper[4743]: done Mar 10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: # Update /etc/hosts only if we get valid service IPs Mar 10 15:07:10 crc kubenswrapper[4743]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 15:07:10 crc kubenswrapper[4743]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 15:07:10 crc kubenswrapper[4743]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 15:07:10 crc kubenswrapper[4743]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 15:07:10 crc kubenswrapper[4743]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 15:07:10 crc kubenswrapper[4743]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 15:07:10 crc kubenswrapper[4743]: sleep 60 & wait Mar 10 15:07:10 crc kubenswrapper[4743]: continue Mar 10 15:07:10 crc kubenswrapper[4743]: fi Mar 10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: # Append resolver entries for services Mar 10 15:07:10 crc kubenswrapper[4743]: rc=0 Mar 10 15:07:10 crc kubenswrapper[4743]: for svc in "${!svc_ips[@]}"; do Mar 10 15:07:10 crc kubenswrapper[4743]: for ip in ${svc_ips[${svc}]}; do Mar 10 15:07:10 crc kubenswrapper[4743]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 10 15:07:10 crc kubenswrapper[4743]: done Mar 10 15:07:10 crc kubenswrapper[4743]: done Mar 10 15:07:10 crc kubenswrapper[4743]: if [[ $rc -ne 0 ]]; then Mar 10 15:07:10 crc kubenswrapper[4743]: sleep 60 & wait Mar 10 15:07:10 crc kubenswrapper[4743]: continue Mar 10 15:07:10 crc kubenswrapper[4743]: fi Mar 10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: Mar 10 15:07:10 crc kubenswrapper[4743]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 15:07:10 crc kubenswrapper[4743]: # Replace /etc/hosts with our modified version if needed Mar 10 15:07:10 crc kubenswrapper[4743]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 15:07:10 crc kubenswrapper[4743]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 15:07:10 crc kubenswrapper[4743]: fi Mar 10 15:07:10 crc kubenswrapper[4743]: sleep 60 & wait Mar 10 15:07:10 crc kubenswrapper[4743]: unset svc_ips Mar 10 15:07:10 crc kubenswrapper[4743]: done Mar 10 15:07:10 crc kubenswrapper[4743]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjc2x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-t9mrg_openshift-dns(5d36817c-a2b7-49c1-92a9-2f9c54fd4f97): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:10 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:10 crc kubenswrapper[4743]: E0310 15:07:10.899459 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-t9mrg" 
podUID="5d36817c-a2b7-49c1-92a9-2f9c54fd4f97" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.912456 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qrnln"] Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.917042 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.917073 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.917068 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.917097 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:10 crc kubenswrapper[4743]: E0310 15:07:10.917244 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:10 crc kubenswrapper[4743]: E0310 15:07:10.918022 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:10 crc kubenswrapper[4743]: E0310 15:07:10.918282 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.924510 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.924722 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.924914 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.925043 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-s46xz"] Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.925148 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.925357 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.926048 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.928758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.928789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.928802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.928840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.928855 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:10Z","lastTransitionTime":"2026-03-10T15:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.929343 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vgbfn"] Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.929452 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.929723 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.929835 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.929930 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.930076 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vgbfn" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.930513 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.934387 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.934512 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.936753 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.949967 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.964133 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.974412 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.983160 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:10 crc kubenswrapper[4743]: I0310 15:07:10.994481 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.003724 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d049bbf-95c6-4135-8808-1e453cf59a07-proxy-tls\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.003975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbln8\" (UniqueName: \"kubernetes.io/projected/1d049bbf-95c6-4135-8808-1e453cf59a07-kube-api-access-jbln8\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.004061 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1d049bbf-95c6-4135-8808-1e453cf59a07-rootfs\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.004141 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d049bbf-95c6-4135-8808-1e453cf59a07-mcd-auth-proxy-config\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.005585 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.017277 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.028419 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.031379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.031415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.031431 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc 
kubenswrapper[4743]: I0310 15:07:11.031477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.031492 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.038806 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.049882 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.059392 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.068428 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.079716 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.090879 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.102588 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.104890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md2gq\" (UniqueName: \"kubernetes.io/projected/32b91cde-a621-4d27-a253-12a8effb3b0b-kube-api-access-md2gq\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.105035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1736aae6-d840-4b31-8c44-6637a05f37ef-cni-binary-copy\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.105160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-socket-dir-parent\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.105288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-system-cni-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.105441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d049bbf-95c6-4135-8808-1e453cf59a07-proxy-tls\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.105549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-etc-kubernetes\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.105658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/32b91cde-a621-4d27-a253-12a8effb3b0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.105781 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-cni-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.105986 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6g99\" (UniqueName: \"kubernetes.io/projected/1736aae6-d840-4b31-8c44-6637a05f37ef-kube-api-access-g6g99\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106105 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-k8s-cni-cncf-io\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106346 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rootfs\" (UniqueName: \"kubernetes.io/host-path/1d049bbf-95c6-4135-8808-1e453cf59a07-rootfs\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106467 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-os-release\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106596 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-cni-multus\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-multus-certs\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/32b91cde-a621-4d27-a253-12a8effb3b0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106997 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-kubelet\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.107135 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-conf-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.107265 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-system-cni-dir\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.107383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-cni-bin\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.107493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-daemon-config\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.107617 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jbln8\" (UniqueName: \"kubernetes.io/projected/1d049bbf-95c6-4135-8808-1e453cf59a07-kube-api-access-jbln8\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.107741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-cnibin\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.107885 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-os-release\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.108008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d049bbf-95c6-4135-8808-1e453cf59a07-mcd-auth-proxy-config\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.108118 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-hostroot\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.108228 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-cnibin\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.108346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-netns\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.106409 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1d049bbf-95c6-4135-8808-1e453cf59a07-rootfs\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.108716 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d049bbf-95c6-4135-8808-1e453cf59a07-mcd-auth-proxy-config\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.113440 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.116220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d049bbf-95c6-4135-8808-1e453cf59a07-proxy-tls\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.123677 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.125634 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbln8\" (UniqueName: \"kubernetes.io/projected/1d049bbf-95c6-4135-8808-1e453cf59a07-kube-api-access-jbln8\") pod \"machine-config-daemon-qrnln\" (UID: \"1d049bbf-95c6-4135-8808-1e453cf59a07\") " pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.134006 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.134617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.134675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.134685 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.134703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.134714 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.145138 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.154133 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.165519 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md2gq\" 
(UniqueName: \"kubernetes.io/projected/32b91cde-a621-4d27-a253-12a8effb3b0b-kube-api-access-md2gq\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1736aae6-d840-4b31-8c44-6637a05f37ef-cni-binary-copy\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-socket-dir-parent\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-system-cni-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209284 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-etc-kubernetes\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32b91cde-a621-4d27-a253-12a8effb3b0b-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-cni-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6g99\" (UniqueName: \"kubernetes.io/projected/1736aae6-d840-4b31-8c44-6637a05f37ef-kube-api-access-g6g99\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209463 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-k8s-cni-cncf-io\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209488 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-os-release\") pod \"multus-vgbfn\" (UID: 
\"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-cni-multus\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-multus-certs\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/32b91cde-a621-4d27-a253-12a8effb3b0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209592 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-kubelet\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209637 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-conf-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " 
pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-system-cni-dir\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209768 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-cni-bin\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209794 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-daemon-config\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209874 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-cnibin\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209876 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-socket-dir-parent\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209899 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-os-release\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-hostroot\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-cnibin\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209972 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-system-cni-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209980 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-netns\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-cni-multus\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210055 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-netns\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-system-cni-dir\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210141 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-k8s-cni-cncf-io\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-conf-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.209788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-etc-kubernetes\") pod \"multus-vgbfn\" (UID: 
\"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210234 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-os-release\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-hostroot\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-os-release\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-kubelet\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-cnibin\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-var-lib-cni-bin\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210360 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-cnibin\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-cni-dir\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1736aae6-d840-4b31-8c44-6637a05f37ef-host-run-multus-certs\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32b91cde-a621-4d27-a253-12a8effb3b0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.210773 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1736aae6-d840-4b31-8c44-6637a05f37ef-cni-binary-copy\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.211342 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/32b91cde-a621-4d27-a253-12a8effb3b0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.211680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1736aae6-d840-4b31-8c44-6637a05f37ef-multus-daemon-config\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.211988 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32b91cde-a621-4d27-a253-12a8effb3b0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.231968 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md2gq\" (UniqueName: \"kubernetes.io/projected/32b91cde-a621-4d27-a253-12a8effb3b0b-kube-api-access-md2gq\") pod \"multus-additional-cni-plugins-s46xz\" (UID: \"32b91cde-a621-4d27-a253-12a8effb3b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.234236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6g99\" (UniqueName: 
\"kubernetes.io/projected/1736aae6-d840-4b31-8c44-6637a05f37ef-kube-api-access-g6g99\") pod \"multus-vgbfn\" (UID: \"1736aae6-d840-4b31-8c44-6637a05f37ef\") " pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.236995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.237029 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.237038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.237053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.237065 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.251414 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.262170 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s46xz" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.273047 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vgbfn" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.276281 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dxdms"] Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.278368 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.280382 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbln8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.286257 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.286600 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.286604 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.287088 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.287004 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 15:07:11 crc 
kubenswrapper[4743]: W0310 15:07:11.288469 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b91cde_a621_4d27_a253_12a8effb3b0b.slice/crio-d51d3722004eb795585e93183ced58af576f4ed7882efed2bc9f7577c7c8f2df WatchSource:0}: Error finding container d51d3722004eb795585e93183ced58af576f4ed7882efed2bc9f7577c7c8f2df: Status 404 returned error can't find the container with id d51d3722004eb795585e93183ced58af576f4ed7882efed2bc9f7577c7c8f2df Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.291776 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.294037 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.294650 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.294699 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md2gq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-s46xz_openshift-multus(32b91cde-a621-4d27-a253-12a8effb3b0b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: W0310 15:07:11.295636 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1736aae6_d840_4b31_8c44_6637a05f37ef.slice/crio-d6e45ecf9eb9ae103a3e6af56bebe895cf44b4f9930a94fcf3709dc6387dd0ce WatchSource:0}: Error finding container d6e45ecf9eb9ae103a3e6af56bebe895cf44b4f9930a94fcf3709dc6387dd0ce: Status 404 returned error can't find the container with id d6e45ecf9eb9ae103a3e6af56bebe895cf44b4f9930a94fcf3709dc6387dd0ce Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.295711 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbln8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.296471 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-s46xz" podUID="32b91cde-a621-4d27-a253-12a8effb3b0b" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.297336 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.304679 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.305801 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:11 crc kubenswrapper[4743]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 15:07:11 crc kubenswrapper[4743]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 15:07:11 crc kubenswrapper[4743]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6g99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-vgbfn_openshift-multus(1736aae6-d840-4b31-8c44-6637a05f37ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:11 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.307080 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-vgbfn" podUID="1736aae6-d840-4b31-8c44-6637a05f37ef" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.319377 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.332468 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.338665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.338794 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.338872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.338945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.339004 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.343839 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.353702 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.362434 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.375043 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.387947 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.399865 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.410688 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.411597 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-config\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.411771 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-systemd-units\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.411991 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-var-lib-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412153 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-slash\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412257 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-ovn-kubernetes\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412376 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91ad6254-92fa-4092-8b86-2393f317f163-ovn-node-metrics-cert\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412618 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-env-overrides\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412800 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-netd\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnddq\" (UniqueName: \"kubernetes.io/projected/91ad6254-92fa-4092-8b86-2393f317f163-kube-api-access-xnddq\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.412993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-kubelet\") pod \"ovnkube-node-dxdms\" (UID: 
\"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.413078 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-script-lib\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.413150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-ovn\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.413223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-log-socket\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.413319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-systemd\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.413405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-bin\") pod \"ovnkube-node-dxdms\" (UID: 
\"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.413618 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-node-log\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.413791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-netns\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.413913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-etc-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.417678 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t9mrg" event={"ID":"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97","Type":"ContainerStarted","Data":"54431130abdcf743573ec00161a0045dda0377db2d925d01db36a1c16a252fdc"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.419101 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" event={"ID":"32b91cde-a621-4d27-a253-12a8effb3b0b","Type":"ContainerStarted","Data":"d51d3722004eb795585e93183ced58af576f4ed7882efed2bc9f7577c7c8f2df"} Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.419690 4743 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:11 crc kubenswrapper[4743]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 15:07:11 crc kubenswrapper[4743]: set -uo pipefail Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 15:07:11 crc kubenswrapper[4743]: HOSTS_FILE="/etc/hosts" Mar 10 15:07:11 crc kubenswrapper[4743]: TEMP_FILE="/etc/hosts.tmp" Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: # Make a temporary file with the old hosts file's attributes. Mar 10 15:07:11 crc kubenswrapper[4743]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 15:07:11 crc kubenswrapper[4743]: echo "Failed to preserve hosts file. Exiting." Mar 10 15:07:11 crc kubenswrapper[4743]: exit 1 Mar 10 15:07:11 crc kubenswrapper[4743]: fi Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: while true; do Mar 10 15:07:11 crc kubenswrapper[4743]: declare -A svc_ips Mar 10 15:07:11 crc kubenswrapper[4743]: for svc in "${services[@]}"; do Mar 10 15:07:11 crc kubenswrapper[4743]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 15:07:11 crc kubenswrapper[4743]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 10 15:07:11 crc kubenswrapper[4743]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 15:07:11 crc kubenswrapper[4743]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 10 15:07:11 crc kubenswrapper[4743]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 15:07:11 crc kubenswrapper[4743]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 15:07:11 crc kubenswrapper[4743]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 15:07:11 crc kubenswrapper[4743]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 15:07:11 crc kubenswrapper[4743]: for i in ${!cmds[*]} Mar 10 15:07:11 crc kubenswrapper[4743]: do Mar 10 15:07:11 crc kubenswrapper[4743]: ips=($(eval "${cmds[i]}")) Mar 10 15:07:11 crc kubenswrapper[4743]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 15:07:11 crc kubenswrapper[4743]: svc_ips["${svc}"]="${ips[@]}" Mar 10 15:07:11 crc kubenswrapper[4743]: break Mar 10 15:07:11 crc kubenswrapper[4743]: fi Mar 10 15:07:11 crc kubenswrapper[4743]: done Mar 10 15:07:11 crc kubenswrapper[4743]: done Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: # Update /etc/hosts only if we get valid service IPs Mar 10 15:07:11 crc kubenswrapper[4743]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 15:07:11 crc kubenswrapper[4743]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 15:07:11 crc kubenswrapper[4743]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 15:07:11 crc kubenswrapper[4743]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 15:07:11 crc kubenswrapper[4743]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 15:07:11 crc kubenswrapper[4743]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 15:07:11 crc kubenswrapper[4743]: sleep 60 & wait Mar 10 15:07:11 crc kubenswrapper[4743]: continue Mar 10 15:07:11 crc kubenswrapper[4743]: fi Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: # Append resolver entries for services Mar 10 15:07:11 crc kubenswrapper[4743]: rc=0 Mar 10 15:07:11 crc kubenswrapper[4743]: for svc in "${!svc_ips[@]}"; do Mar 10 15:07:11 crc kubenswrapper[4743]: for ip in ${svc_ips[${svc}]}; do Mar 10 15:07:11 crc kubenswrapper[4743]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 10 15:07:11 crc kubenswrapper[4743]: done Mar 10 15:07:11 crc kubenswrapper[4743]: done Mar 10 15:07:11 crc kubenswrapper[4743]: if [[ $rc -ne 0 ]]; then Mar 10 15:07:11 crc kubenswrapper[4743]: sleep 60 & wait Mar 10 15:07:11 crc kubenswrapper[4743]: continue Mar 10 15:07:11 crc kubenswrapper[4743]: fi Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: Mar 10 15:07:11 crc kubenswrapper[4743]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 15:07:11 crc kubenswrapper[4743]: # Replace /etc/hosts with our modified version if needed Mar 10 15:07:11 crc kubenswrapper[4743]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 15:07:11 crc kubenswrapper[4743]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 15:07:11 crc kubenswrapper[4743]: fi Mar 10 15:07:11 crc kubenswrapper[4743]: sleep 60 & wait Mar 10 15:07:11 crc kubenswrapper[4743]: unset svc_ips Mar 10 15:07:11 crc kubenswrapper[4743]: done Mar 10 15:07:11 crc kubenswrapper[4743]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjc2x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-t9mrg_openshift-dns(5d36817c-a2b7-49c1-92a9-2f9c54fd4f97): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:11 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.420237 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md2gq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-s46xz_openshift-multus(32b91cde-a621-4d27-a253-12a8effb3b0b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.420910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"37cd80ace186cc8cef8598078ad62386981e791e73ce62c8f9e4531786dbff62"} Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.421524 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-s46xz" podUID="32b91cde-a621-4d27-a253-12a8effb3b0b" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.422018 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-t9mrg" podUID="5d36817c-a2b7-49c1-92a9-2f9c54fd4f97" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.422227 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbln8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.422697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgbfn" event={"ID":"1736aae6-d840-4b31-8c44-6637a05f37ef","Type":"ContainerStarted","Data":"d6e45ecf9eb9ae103a3e6af56bebe895cf44b4f9930a94fcf3709dc6387dd0ce"} Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.423589 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:11 crc 
kubenswrapper[4743]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 15:07:11 crc kubenswrapper[4743]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 15:07:11 crc kubenswrapper[4743]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6g99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-vgbfn_openshift-multus(1736aae6-d840-4b31-8c44-6637a05f37ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:11 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.428140 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-vgbfn" podUID="1736aae6-d840-4b31-8c44-6637a05f37ef" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.428171 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbln8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.429398 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.429703 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.441160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.441210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc 
kubenswrapper[4743]: I0310 15:07:11.441219 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.441236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.441245 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.453446 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.465324 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.484195 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.495751 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.508674 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.516479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-ovn\") pod \"ovnkube-node-dxdms\" (UID: 
\"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.516539 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-log-socket\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.516562 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-bin\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.516598 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-systemd\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.516617 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-node-log\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.516700 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-netns\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 
crc kubenswrapper[4743]: I0310 15:07:11.516728 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-etc-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-var-lib-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-config\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-systemd-units\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-slash\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517128 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-ovn-kubernetes\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91ad6254-92fa-4092-8b86-2393f317f163-ovn-node-metrics-cert\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-env-overrides\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517338 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-netd\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517371 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-script-lib\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnddq\" (UniqueName: \"kubernetes.io/projected/91ad6254-92fa-4092-8b86-2393f317f163-kube-api-access-xnddq\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-kubelet\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-kubelet\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517652 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-ovn\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-log-socket\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517720 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-bin\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-systemd\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.517994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-node-log\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.518386 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-netns\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.518452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-etc-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.519469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-netd\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.519487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.519745 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.520118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-slash\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc 
kubenswrapper[4743]: I0310 15:07:11.520169 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-systemd-units\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.520244 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-ovn-kubernetes\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.520242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-var-lib-openvswitch\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.520719 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-env-overrides\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.521721 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-config\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.522174 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-script-lib\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.523298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91ad6254-92fa-4092-8b86-2393f317f163-ovn-node-metrics-cert\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.526271 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.537419 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.543574 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnddq\" (UniqueName: \"kubernetes.io/projected/91ad6254-92fa-4092-8b86-2393f317f163-kube-api-access-xnddq\") pod \"ovnkube-node-dxdms\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.543906 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.543958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.543974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.543997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.544016 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.551011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.566601 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.579578 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.592940 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.601234 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.608566 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.621178 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: W0310 15:07:11.622399 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91ad6254_92fa_4092_8b86_2393f317f163.slice/crio-1b3547cbebac76cbd1b7a034ce35c266fc738100f85d153206f34093fae20903 WatchSource:0}: Error finding container 1b3547cbebac76cbd1b7a034ce35c266fc738100f85d153206f34093fae20903: Status 404 returned error can't find the container with id 1b3547cbebac76cbd1b7a034ce35c266fc738100f85d153206f34093fae20903 Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.625890 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:11 crc kubenswrapper[4743]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 15:07:11 crc kubenswrapper[4743]: apiVersion: v1 Mar 10 15:07:11 crc kubenswrapper[4743]: clusters: Mar 10 15:07:11 crc kubenswrapper[4743]: - cluster: Mar 10 15:07:11 crc kubenswrapper[4743]: certificate-authority: 
/var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 15:07:11 crc kubenswrapper[4743]: server: https://api-int.crc.testing:6443 Mar 10 15:07:11 crc kubenswrapper[4743]: name: default-cluster Mar 10 15:07:11 crc kubenswrapper[4743]: contexts: Mar 10 15:07:11 crc kubenswrapper[4743]: - context: Mar 10 15:07:11 crc kubenswrapper[4743]: cluster: default-cluster Mar 10 15:07:11 crc kubenswrapper[4743]: namespace: default Mar 10 15:07:11 crc kubenswrapper[4743]: user: default-auth Mar 10 15:07:11 crc kubenswrapper[4743]: name: default-context Mar 10 15:07:11 crc kubenswrapper[4743]: current-context: default-context Mar 10 15:07:11 crc kubenswrapper[4743]: kind: Config Mar 10 15:07:11 crc kubenswrapper[4743]: preferences: {} Mar 10 15:07:11 crc kubenswrapper[4743]: users: Mar 10 15:07:11 crc kubenswrapper[4743]: - name: default-auth Mar 10 15:07:11 crc kubenswrapper[4743]: user: Mar 10 15:07:11 crc kubenswrapper[4743]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 15:07:11 crc kubenswrapper[4743]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 15:07:11 crc kubenswrapper[4743]: EOF Mar 10 15:07:11 crc kubenswrapper[4743]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xnddq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:11 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:11 crc kubenswrapper[4743]: E0310 15:07:11.627024 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.631767 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.646848 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.646887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.646897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.646914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.646927 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.749755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.749845 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.749866 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.749892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.749922 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.852706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.853073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.853137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.853269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.853334 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.956626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.956681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.956695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.956716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:11 crc kubenswrapper[4743]: I0310 15:07:11.956730 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:11Z","lastTransitionTime":"2026-03-10T15:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.058539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.058573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.058581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.058595 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.058605 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.161610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.161668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.161677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.161692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.161701 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.264657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.264705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.264715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.264733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.264746 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.367752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.367795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.367804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.367837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.367848 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.427580 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"1b3547cbebac76cbd1b7a034ce35c266fc738100f85d153206f34093fae20903"} Mar 10 15:07:12 crc kubenswrapper[4743]: E0310 15:07:12.429360 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:12 crc kubenswrapper[4743]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 15:07:12 crc kubenswrapper[4743]: apiVersion: v1 Mar 10 15:07:12 crc kubenswrapper[4743]: clusters: Mar 10 15:07:12 crc kubenswrapper[4743]: - cluster: Mar 10 15:07:12 crc kubenswrapper[4743]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 15:07:12 crc kubenswrapper[4743]: server: https://api-int.crc.testing:6443 Mar 10 15:07:12 crc kubenswrapper[4743]: name: default-cluster Mar 10 15:07:12 crc kubenswrapper[4743]: contexts: Mar 10 15:07:12 crc kubenswrapper[4743]: - context: Mar 10 15:07:12 crc kubenswrapper[4743]: cluster: default-cluster Mar 10 15:07:12 crc kubenswrapper[4743]: namespace: default Mar 10 15:07:12 crc kubenswrapper[4743]: user: default-auth Mar 10 15:07:12 crc kubenswrapper[4743]: name: default-context Mar 10 15:07:12 crc kubenswrapper[4743]: current-context: default-context Mar 10 15:07:12 crc kubenswrapper[4743]: kind: Config Mar 10 15:07:12 crc kubenswrapper[4743]: preferences: {} Mar 10 15:07:12 crc kubenswrapper[4743]: users: Mar 10 15:07:12 crc kubenswrapper[4743]: - name: default-auth Mar 10 15:07:12 crc kubenswrapper[4743]: user: Mar 10 15:07:12 crc kubenswrapper[4743]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 15:07:12 crc 
kubenswrapper[4743]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 15:07:12 crc kubenswrapper[4743]: EOF Mar 10 15:07:12 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xnddq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:12 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:12 crc kubenswrapper[4743]: E0310 15:07:12.430575 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.446253 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.462051 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.471277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.471380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.471407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.471446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.471474 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.478726 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.492234 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.518919 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.538848 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.553323 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.575586 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.575652 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.575675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.575714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.575753 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.577854 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.594383 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.608804 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.623138 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.637574 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.652066 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.678500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.678530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.678538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.678555 4743 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.678565 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.780874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.780926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.780943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.780966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.780982 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.884261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.884304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.884316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.884336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.884349 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.914746 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.914797 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:12 crc kubenswrapper[4743]: E0310 15:07:12.915028 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.915082 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:12 crc kubenswrapper[4743]: E0310 15:07:12.915246 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:12 crc kubenswrapper[4743]: E0310 15:07:12.915612 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:12 crc kubenswrapper[4743]: E0310 15:07:12.918356 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:12 crc kubenswrapper[4743]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 15:07:12 crc kubenswrapper[4743]: set -o allexport Mar 10 15:07:12 crc kubenswrapper[4743]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 15:07:12 crc kubenswrapper[4743]: source /etc/kubernetes/apiserver-url.env Mar 10 15:07:12 crc kubenswrapper[4743]: else Mar 10 15:07:12 crc kubenswrapper[4743]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 15:07:12 crc kubenswrapper[4743]: exit 1 Mar 10 15:07:12 crc kubenswrapper[4743]: fi Mar 10 15:07:12 crc kubenswrapper[4743]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 15:07:12 crc kubenswrapper[4743]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:12 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:07:12 crc kubenswrapper[4743]: E0310 15:07:12.919535 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.986965 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 
15:07:12.987020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.987034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.987055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:12 crc kubenswrapper[4743]: I0310 15:07:12.987075 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:12Z","lastTransitionTime":"2026-03-10T15:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.091463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.091519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.091534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.091556 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.091567 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.195027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.195125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.195150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.195184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.195209 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.297360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.297415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.297429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.297450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.297465 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.400266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.400331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.400346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.400370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.400388 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.503891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.503950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.503964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.503982 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.503995 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.518913 4743 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.607274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.607337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.607348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.607369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.607387 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.709540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.709580 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.709588 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.709602 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.709610 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.812771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.812855 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.812865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.812883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.812894 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.916019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.916123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.916143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.916174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:13 crc kubenswrapper[4743]: I0310 15:07:13.916193 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:13Z","lastTransitionTime":"2026-03-10T15:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.019572 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.019621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.019633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.019650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.019662 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.122575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.122628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.122640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.122657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.122670 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.225370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.225463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.225477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.225496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.225509 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.328619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.328676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.328684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.328699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.328708 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.432615 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.432654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.432663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.432680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.432690 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.535616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.535705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.535733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.535767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.535788 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.640313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.640379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.640399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.640424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.640442 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.744131 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.744231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.744245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.744272 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.744286 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.847740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.847873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.847894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.847924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.847944 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.914455 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.914543 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:14 crc kubenswrapper[4743]: E0310 15:07:14.914721 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.914744 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:14 crc kubenswrapper[4743]: E0310 15:07:14.915037 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:14 crc kubenswrapper[4743]: E0310 15:07:14.915166 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.916470 4743 scope.go:117] "RemoveContainer" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.953135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.953195 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.953209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.953235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:14 crc kubenswrapper[4743]: I0310 15:07:14.953254 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:14Z","lastTransitionTime":"2026-03-10T15:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.057074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.057138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.057152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.057172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.057187 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.161377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.161429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.161449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.161469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.161481 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.264219 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.264274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.264286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.264301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.264311 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.367438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.367524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.367557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.367579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.367592 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.443689 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.443757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.446574 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.448681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.449036 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.464112 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.470231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.470277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.470291 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.470309 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.470324 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.487204 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.500493 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.512280 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.521471 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.530939 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.543060 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.554703 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.566151 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.573450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.573511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.573529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.573553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.573567 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.613125 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.641914 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.658944 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.671544 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.680341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.680396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.680409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.680427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.680442 4743 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.689664 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.701788 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.718092 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.731506 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.745254 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.757933 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.770753 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.783493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.783526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.783534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.783548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.783572 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.786988 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.802062 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 
15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.819191 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.832676 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.847206 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.865774 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.887298 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.887366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.887379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.887400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.887414 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.930049 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.943573 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.955197 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.968152 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.982153 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 
15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.990192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.990238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.990253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.990271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.990282 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:15Z","lastTransitionTime":"2026-03-10T15:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:15 crc kubenswrapper[4743]: I0310 15:07:15.996011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.005182 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.019507 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.041220 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.054746 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.067113 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.081096 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.092569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.092605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.092615 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc 
kubenswrapper[4743]: I0310 15:07:16.092631 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.092641 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.093616 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.195145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.195200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.195209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.195226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.195236 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.297344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.297390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.297399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.297416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.297426 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.400288 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.400324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.400333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.400348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.400359 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.502525 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.502589 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.502602 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.502623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.502636 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.605115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.605152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.605161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.605175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.605184 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.707871 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.707917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.707931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.707949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.707963 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.810400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.810446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.810456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.810472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.810486 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.898992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.899046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.899059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.899080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.899095 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.911174 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.914059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.914103 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.914113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.914145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.914155 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.914307 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.914475 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.914536 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.914525 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.914677 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.914842 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.926548 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.930591 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.930627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.930638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.930654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.930667 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.950717 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.954421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.954481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.954502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.954524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.954542 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.966773 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.970939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.970989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.971005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.971025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.971039 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.985896 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:16 crc kubenswrapper[4743]: E0310 15:07:16.986007 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.987579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.987631 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.987646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.987670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:16 crc kubenswrapper[4743]: I0310 15:07:16.987690 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:16Z","lastTransitionTime":"2026-03-10T15:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.090271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.090329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.090338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.090354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.090385 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.114913 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rrjqn"] Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.115480 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.117984 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.118634 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.119470 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.119493 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.132745 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.148200 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.164992 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.174739 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ba758a8-e906-48f3-ae10-c045f908306e-serviceca\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.174802 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-wqbpj\" (UniqueName: \"kubernetes.io/projected/6ba758a8-e906-48f3-ae10-c045f908306e-kube-api-access-wqbpj\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.174858 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ba758a8-e906-48f3-ae10-c045f908306e-host\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.179679 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.189730 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.192403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.192458 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.192469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.192486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.192495 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.202482 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.213776 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 
15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.225877 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.239985 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.252057 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.264540 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.275311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ba758a8-e906-48f3-ae10-c045f908306e-host\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.275387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ba758a8-e906-48f3-ae10-c045f908306e-serviceca\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.275406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ba758a8-e906-48f3-ae10-c045f908306e-host\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " 
pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.275431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqbpj\" (UniqueName: \"kubernetes.io/projected/6ba758a8-e906-48f3-ae10-c045f908306e-kube-api-access-wqbpj\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.276696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ba758a8-e906-48f3-ae10-c045f908306e-serviceca\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.278595 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.289772 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.292920 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqbpj\" (UniqueName: \"kubernetes.io/projected/6ba758a8-e906-48f3-ae10-c045f908306e-kube-api-access-wqbpj\") pod \"node-ca-rrjqn\" (UID: \"6ba758a8-e906-48f3-ae10-c045f908306e\") " pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.294526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.294566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.294578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.294596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.294613 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.306436 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.397195 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.397240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.397249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.397263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.397273 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.427920 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rrjqn" Mar 10 15:07:17 crc kubenswrapper[4743]: W0310 15:07:17.440540 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba758a8_e906_48f3_ae10_c045f908306e.slice/crio-84124984bedf3c9dee973de4680d04baab9ffce299b12a6a410b2dbf3e5a46c8 WatchSource:0}: Error finding container 84124984bedf3c9dee973de4680d04baab9ffce299b12a6a410b2dbf3e5a46c8: Status 404 returned error can't find the container with id 84124984bedf3c9dee973de4680d04baab9ffce299b12a6a410b2dbf3e5a46c8 Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.456709 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rrjqn" event={"ID":"6ba758a8-e906-48f3-ae10-c045f908306e","Type":"ContainerStarted","Data":"84124984bedf3c9dee973de4680d04baab9ffce299b12a6a410b2dbf3e5a46c8"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.506541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.506581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.506592 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.506607 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.506618 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.608950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.608992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.609001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.609021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.609031 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.712599 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.712696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.712722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.712754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.712779 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.817455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.817493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.817505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.817522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.817535 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.921175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.921224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.921237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.921253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:17 crc kubenswrapper[4743]: I0310 15:07:17.921266 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:17Z","lastTransitionTime":"2026-03-10T15:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.024171 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.024211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.024222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.024238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.024247 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.127399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.127439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.127448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.127463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.127473 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.230750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.230857 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.230881 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.230912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.230934 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.334560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.334608 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.334619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.334639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.334651 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.437455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.437502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.437510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.437528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.437540 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.461132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rrjqn" event={"ID":"6ba758a8-e906-48f3-ae10-c045f908306e","Type":"ContainerStarted","Data":"8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.477743 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.501660 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.517539 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.528371 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.540942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.541002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.541018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.541373 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.541397 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.542596 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.560668 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.577182 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.594348 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 
15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.609671 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.627701 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.641575 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.643745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.644091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.644132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.644169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.644193 4743 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.666901 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.682358 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.696275 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:18 crc 
kubenswrapper[4743]: I0310 15:07:18.746337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.746377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.746392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.746410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.746420 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.849061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.849099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.849112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.849132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.849145 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.915145 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.915363 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:18 crc kubenswrapper[4743]: E0310 15:07:18.915465 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.915484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:18 crc kubenswrapper[4743]: E0310 15:07:18.915653 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:18 crc kubenswrapper[4743]: E0310 15:07:18.916957 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.952541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.952587 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.952622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.952640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:18 crc kubenswrapper[4743]: I0310 15:07:18.952654 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:18Z","lastTransitionTime":"2026-03-10T15:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.056039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.056119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.056141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.056165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.056183 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.159869 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.159914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.159923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.159941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.159952 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.262051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.262119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.262129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.262149 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.262168 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.365181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.365226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.365235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.365251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.365261 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.467556 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.467616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.467629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.467649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.467665 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.570561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.570597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.570606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.570621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.570633 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.673101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.673140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.673149 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.673165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.673175 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.775960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.776014 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.776025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.776045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.776057 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.878824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.878874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.878884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.878902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.878912 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.982172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.982221 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.982230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.982247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:19 crc kubenswrapper[4743]: I0310 15:07:19.982259 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:19Z","lastTransitionTime":"2026-03-10T15:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.085142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.085210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.085227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.085256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.085274 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.188208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.188288 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.188305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.188334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.188352 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.291253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.291316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.291333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.291359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.291374 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.394458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.394512 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.394528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.394552 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.394563 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.498841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.498894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.498909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.498931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.498941 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.602256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.602312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.602322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.602339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.602349 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.705133 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.705164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.705172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.705185 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.705193 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.809043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.809122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.809137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.809162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.809177 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.911511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.911551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.911562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.911581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.911593 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:20Z","lastTransitionTime":"2026-03-10T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.914376 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:20 crc kubenswrapper[4743]: E0310 15:07:20.914488 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.914901 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:20 crc kubenswrapper[4743]: E0310 15:07:20.915000 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:20 crc kubenswrapper[4743]: I0310 15:07:20.915086 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:20 crc kubenswrapper[4743]: E0310 15:07:20.915172 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.015438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.015500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.015514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.015536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.015549 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.118908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.119026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.119090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.119114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.119142 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:07:53.1190906 +0000 UTC m=+137.825905378 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.119083 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.119217 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.119184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.119297 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:53.119277216 +0000 UTC m=+137.826091964 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.119302 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.119342 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.119460 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.119539 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:53.119530123 +0000 UTC m=+137.826344871 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.220836 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.220957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.221089 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.221128 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.221147 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.221165 4743 projected.go:194] Error preparing data for 
projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.221179 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.221198 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.221270 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:53.221244199 +0000 UTC m=+137.928058947 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:21 crc kubenswrapper[4743]: E0310 15:07:21.221315 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:07:53.22128727 +0000 UTC m=+137.928102018 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.223694 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.223793 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.223872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.223911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.223935 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.326018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.326089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.326112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.326144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.326160 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.430646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.430720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.430733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.430755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.430775 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.473370 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.496686 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.510458 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.533780 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.533886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.533949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.533966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc 
kubenswrapper[4743]: I0310 15:07:21.533991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.534008 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.551337 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.570167 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.592441 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 
15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.609196 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.628426 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.638006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.638076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.638099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 
15:07:21.638148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.638172 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.647566 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.667153 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.681570 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.694081 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc 
kubenswrapper[4743]: I0310 15:07:21.710162 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.730228 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.741446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.741490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.741502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.741521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.741535 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.844227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.844266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.844274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.844289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.844299 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.947356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.947403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.947413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.947429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:21 crc kubenswrapper[4743]: I0310 15:07:21.947443 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:21Z","lastTransitionTime":"2026-03-10T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.051413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.051458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.051467 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.051483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.051495 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.153899 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.153938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.153947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.153964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.153973 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.275560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.275629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.275650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.275679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.275697 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.379254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.379865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.380070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.380318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.380538 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.477102 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t9mrg" event={"ID":"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97","Type":"ContainerStarted","Data":"f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.483264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.483477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.483554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.483627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.483690 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.489187 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.498705 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.509495 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.527590 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.542388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 
15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.555135 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.567605 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.579336 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.587020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.587269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.587398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.587531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.587648 4743 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.592118 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.614065 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.626012 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.639200 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.659699 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.691264 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.693422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.693497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.693515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.693539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.693556 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.797030 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.797293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.797303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.797320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.797329 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.899158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.899224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.899241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.899265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.899281 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:22Z","lastTransitionTime":"2026-03-10T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.914656 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.914949 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.914954 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:22 crc kubenswrapper[4743]: E0310 15:07:22.915069 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:22 crc kubenswrapper[4743]: E0310 15:07:22.915959 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:22 crc kubenswrapper[4743]: E0310 15:07:22.916792 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:22 crc kubenswrapper[4743]: I0310 15:07:22.942499 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.004354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.004412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.004431 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.004458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.004480 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.101023 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7"] Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.101573 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.103640 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.103719 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.110397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.110445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.110457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.110474 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.110490 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.116941 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.128181 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.141871 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/5781a455-9df4-408a-9b78-9055f53dc9e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.141926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5781a455-9df4-408a-9b78-9055f53dc9e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.141996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5781a455-9df4-408a-9b78-9055f53dc9e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.142088 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqrfp\" (UniqueName: \"kubernetes.io/projected/5781a455-9df4-408a-9b78-9055f53dc9e6-kube-api-access-hqrfp\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.145477 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.161117 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.172916 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.186212 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.202931 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.213172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.213210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.213220 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.213237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.213247 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.220920 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.235388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.243102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5781a455-9df4-408a-9b78-9055f53dc9e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.243195 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5781a455-9df4-408a-9b78-9055f53dc9e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.243265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/5781a455-9df4-408a-9b78-9055f53dc9e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.243313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqrfp\" (UniqueName: \"kubernetes.io/projected/5781a455-9df4-408a-9b78-9055f53dc9e6-kube-api-access-hqrfp\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.244070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5781a455-9df4-408a-9b78-9055f53dc9e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.244091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5781a455-9df4-408a-9b78-9055f53dc9e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.248614 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.253991 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5781a455-9df4-408a-9b78-9055f53dc9e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.259427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqrfp\" (UniqueName: \"kubernetes.io/projected/5781a455-9df4-408a-9b78-9055f53dc9e6-kube-api-access-hqrfp\") pod \"ovnkube-control-plane-749d76644c-nmxc7\" (UID: \"5781a455-9df4-408a-9b78-9055f53dc9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.268249 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.280692 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.293121 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.315892 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d1
46970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.316106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.316138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.316148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.316163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.316173 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.330556 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.354158 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.419435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.419489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.419500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.419520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.419534 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.454623 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" Mar 10 15:07:23 crc kubenswrapper[4743]: W0310 15:07:23.470593 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5781a455_9df4_408a_9b78_9055f53dc9e6.slice/crio-ee5db03d12c8bf2237fa9de5808438cebcf75ae0b93adbfe3498be3fb2437914 WatchSource:0}: Error finding container ee5db03d12c8bf2237fa9de5808438cebcf75ae0b93adbfe3498be3fb2437914: Status 404 returned error can't find the container with id ee5db03d12c8bf2237fa9de5808438cebcf75ae0b93adbfe3498be3fb2437914 Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.484422 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8" exitCode=0 Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.484577 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.486845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" event={"ID":"5781a455-9df4-408a-9b78-9055f53dc9e6","Type":"ContainerStarted","Data":"ee5db03d12c8bf2237fa9de5808438cebcf75ae0b93adbfe3498be3fb2437914"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.489446 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgbfn" event={"ID":"1736aae6-d840-4b31-8c44-6637a05f37ef","Type":"ContainerStarted","Data":"403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.492321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.492392 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.501482 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.517460 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.523684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.523740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.523749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 
15:07:23.523770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.523782 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.534731 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.550972 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.567406 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.583418 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.596899 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 
15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.609654 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.626544 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.627967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.627996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.628005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.628020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.628031 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.650253 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.673851 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.690756 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.709318 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.729397 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.731343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.731401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.731414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.731455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.731471 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.748046 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.768088 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.782337 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.797597 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.816207 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.832988 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 
15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.834806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.834858 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.834870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.834890 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.834905 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.849613 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.865634 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vcq2w"] Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.865750 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.866292 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:23 crc kubenswrapper[4743]: E0310 15:07:23.866576 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.889703 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10
T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.901961 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.917945 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.930698 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.937229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.937260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.937270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.937282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.937293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:23Z","lastTransitionTime":"2026-03-10T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.940103 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.950170 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rblb\" (UniqueName: \"kubernetes.io/projected/acbc8434-7aab-481b-ae0e-08696da082ad-kube-api-access-2rblb\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.950283 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.964376 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:23 crc kubenswrapper[4743]: I0310 15:07:23.996208 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.015491 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.029373 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.040666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.040698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.040712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc 
kubenswrapper[4743]: I0310 15:07:24.040727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.040738 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.044963 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.051331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rblb\" (UniqueName: \"kubernetes.io/projected/acbc8434-7aab-481b-ae0e-08696da082ad-kube-api-access-2rblb\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.051402 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:24 crc kubenswrapper[4743]: E0310 15:07:24.051504 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:24 crc kubenswrapper[4743]: E0310 15:07:24.051579 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs podName:acbc8434-7aab-481b-ae0e-08696da082ad nodeName:}" failed. No retries permitted until 2026-03-10 15:07:24.551565633 +0000 UTC m=+109.258380381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs") pod "network-metrics-daemon-vcq2w" (UID: "acbc8434-7aab-481b-ae0e-08696da082ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.061923 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.072509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rblb\" (UniqueName: 
\"kubernetes.io/projected/acbc8434-7aab-481b-ae0e-08696da082ad-kube-api-access-2rblb\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.079229 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\
":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.093076 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.121312 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d1
46970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.137268 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.142980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.143041 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.143056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.143077 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.143089 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.158435 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.171738 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.182365 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.194488 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.207312 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.221202 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc 
kubenswrapper[4743]: I0310 15:07:24.231936 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.242182 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18
e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.246181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.246300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.246385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.246460 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.246524 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.256680 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6
a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.274042 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.290911 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.308141 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.322721 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.336656 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.350595 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.350669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.350688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.350719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.350741 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.453296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.453334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.453345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.453359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.453370 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.497646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.497741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.497753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.497765 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.499550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" event={"ID":"5781a455-9df4-408a-9b78-9055f53dc9e6","Type":"ContainerStarted","Data":"80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.499604 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" event={"ID":"5781a455-9df4-408a-9b78-9055f53dc9e6","Type":"ContainerStarted","Data":"9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e"} 
Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.513295 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.527443 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.543578 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.557320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.557402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.557424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.557435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.557452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.557466 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: E0310 15:07:24.558266 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:24 crc kubenswrapper[4743]: E0310 15:07:24.558326 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs podName:acbc8434-7aab-481b-ae0e-08696da082ad nodeName:}" failed. No retries permitted until 2026-03-10 15:07:25.5583063 +0000 UTC m=+110.265121038 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs") pod "network-metrics-daemon-vcq2w" (UID: "acbc8434-7aab-481b-ae0e-08696da082ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.561325 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.575349 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.588623 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.599984 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.615069 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.626799 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.639848 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.654505 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.660692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.661233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.661249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.661269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.661281 4743 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.676168 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.698757 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.718380 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.733049 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.747283 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc 
kubenswrapper[4743]: I0310 15:07:24.764120 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.764488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.764563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.764578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.764596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.764607 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.780432 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T15:07:24Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.867483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.867550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.867569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.867598 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.867615 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.915076 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.915173 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.915259 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:24 crc kubenswrapper[4743]: E0310 15:07:24.915214 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:24 crc kubenswrapper[4743]: E0310 15:07:24.915387 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:24 crc kubenswrapper[4743]: E0310 15:07:24.915492 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.970701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.970748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.970760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.970776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4743]: I0310 15:07:24.970787 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.072861 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.072901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.072911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.072924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.072933 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.174934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.175001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.175014 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.175032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.175046 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.278018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.278059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.278074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.278094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.278110 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.380894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.380944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.380954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.380968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.380978 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.484074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.484143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.484155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.484173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.484187 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.504733 4743 generic.go:334] "Generic (PLEG): container finished" podID="32b91cde-a621-4d27-a253-12a8effb3b0b" containerID="5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488" exitCode=0 Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.504796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" event={"ID":"32b91cde-a621-4d27-a253-12a8effb3b0b","Type":"ContainerDied","Data":"5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.509591 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.509657 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.518450 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.531725 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.552318 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.565891 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.566048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:25 crc kubenswrapper[4743]: E0310 15:07:25.566194 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:25 crc kubenswrapper[4743]: E0310 15:07:25.566257 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs podName:acbc8434-7aab-481b-ae0e-08696da082ad nodeName:}" failed. No retries permitted until 2026-03-10 15:07:27.566237941 +0000 UTC m=+112.273052739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs") pod "network-metrics-daemon-vcq2w" (UID: "acbc8434-7aab-481b-ae0e-08696da082ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.583448 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.587320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.587356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.587365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.587380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.587389 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.598779 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc 
kubenswrapper[4743]: I0310 15:07:25.614529 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.626715 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.649140 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.664006 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.678690 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.690151 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.690199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.690209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.690229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.690241 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.693367 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.704588 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.716138 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.729106 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.744569 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.760017 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.776186 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.793493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.793550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.793563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.793589 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.793602 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.896146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.896760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.896788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.896866 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.896893 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:25Z","lastTransitionTime":"2026-03-10T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.914711 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:25 crc kubenswrapper[4743]: E0310 15:07:25.914940 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.932342 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.944899 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 
10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.962208 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.978207 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:25 crc kubenswrapper[4743]: I0310 15:07:25.994662 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:25.999994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.000024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.000054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.000070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.000080 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.011072 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.029613 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.043529 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.054114 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.066621 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.080374 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.094306 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.103027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.103072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.103086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.103116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.103131 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.108195 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.119983 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.130549 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.148056 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.166795 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.184509 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.205134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.205184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.205196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.205215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.205228 4743 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.308379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.308444 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.308470 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.308507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.308524 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.411749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.411791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.411800 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.411828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.411839 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.513896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.513953 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.513970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.514026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.514044 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.515345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.517901 4743 generic.go:334] "Generic (PLEG): container finished" podID="32b91cde-a621-4d27-a253-12a8effb3b0b" containerID="9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac" exitCode=0 Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.517953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" event={"ID":"32b91cde-a621-4d27-a253-12a8effb3b0b","Type":"ContainerDied","Data":"9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.530785 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.546843 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.562612 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.575826 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.590162 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.602573 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.616789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.616846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.616857 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.616870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.616878 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.617381 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b08
4652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab
02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.631641 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.646563 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.659519 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\
"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.675125 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.693521 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.716031 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.720718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.720773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.720785 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.720803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.720843 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.738224 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.754936 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.773147 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc 
kubenswrapper[4743]: I0310 15:07:26.792897 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.810537 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.824147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.824216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.824228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc 
kubenswrapper[4743]: I0310 15:07:26.824247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.824262 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.825624 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.841713 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.861775 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.895008 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.914371 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.914453 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:26 crc kubenswrapper[4743]: E0310 15:07:26.914528 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.914550 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:26 crc kubenswrapper[4743]: E0310 15:07:26.914700 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:26 crc kubenswrapper[4743]: E0310 15:07:26.914777 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.927378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.927431 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.927445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.927471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.927489 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:26 crc kubenswrapper[4743]: I0310 15:07:26.933614 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:26 crc 
kubenswrapper[4743]: I0310 15:07:26.971929 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.013605 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18
e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.030773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.030839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.030854 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.030872 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.030885 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.058648 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6
a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.095876 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.133717 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.135067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.135139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.135153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.135176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.135190 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.176347 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.217057 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.237487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.237531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.237545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 
15:07:27.237564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.237578 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.253613 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.293321 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.323509 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.323542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.323551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.323569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.323579 4743 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.333314 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.336450 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.340462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.340541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.340557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.340576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.340588 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.357107 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.361699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.361962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.362055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.362144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.362220 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.375772 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.381138 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.381851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.381894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.381906 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.381924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.381937 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.395543 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.399661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.399698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.399710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.399730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.399742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.413048 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.413155 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.414768 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.415348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.415410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.415425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.415445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.415458 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.421747 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.488740 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.513873 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.517475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.517651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.517759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.517867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.517952 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.523301 4743 generic.go:334] "Generic (PLEG): container finished" podID="32b91cde-a621-4d27-a253-12a8effb3b0b" containerID="70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87" exitCode=0 Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.523379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" event={"ID":"32b91cde-a621-4d27-a253-12a8effb3b0b","Type":"ContainerDied","Data":"70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.527771 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.534071 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.578791 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.590760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.590952 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.591024 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs podName:acbc8434-7aab-481b-ae0e-08696da082ad nodeName:}" failed. No retries permitted until 2026-03-10 15:07:31.591005205 +0000 UTC m=+116.297819953 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs") pod "network-metrics-daemon-vcq2w" (UID: "acbc8434-7aab-481b-ae0e-08696da082ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.615631 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.626552 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.626605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.626616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.626639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.626651 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.667408 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.694395 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.729043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.729080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.729091 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.729108 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.729121 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.737109 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac42
6ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.776125 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.815611 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.831675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc 
kubenswrapper[4743]: I0310 15:07:27.831736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.831748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.831778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.831791 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.853583 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.894337 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.915219 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:27 crc kubenswrapper[4743]: E0310 15:07:27.915565 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.931728 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.934409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.934893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.934999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.935097 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.935183 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4743]: I0310 15:07:27.973475 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.014483 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab
02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.037749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.037792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.037805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.037841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.037855 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.053222 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.093569 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.132971 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.142095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.142134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.142146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.142161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.142173 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.175102 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.215063 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.244827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.244873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.244882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.244914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.244928 4743 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.251479 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.292246 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.341565 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.347668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.347731 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.347745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.347783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.347799 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.380862 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.416559 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d
4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.451594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.451662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.451676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.451702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.451722 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.453777 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z 
is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.495882 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc 
kubenswrapper[4743]: I0310 15:07:28.535336 4743 generic.go:334] "Generic (PLEG): container finished" podID="32b91cde-a621-4d27-a253-12a8effb3b0b" containerID="7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2" exitCode=0 Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.535419 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" event={"ID":"32b91cde-a621-4d27-a253-12a8effb3b0b","Type":"ContainerDied","Data":"7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.538169 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.555140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.555322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.555425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.555526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.555589 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.577558 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.620280 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd343
20d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.654779 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.658150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.658188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.658199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.658216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.658227 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.694236 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.733591 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.761077 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.761126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.761141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.761163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.761178 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.771624 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.815597 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.852478 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.864634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.864703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.864716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.864786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.864804 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.900253 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.916627 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.916778 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:28 crc kubenswrapper[4743]: E0310 15:07:28.916974 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.917308 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:28 crc kubenswrapper[4743]: E0310 15:07:28.917506 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:28 crc kubenswrapper[4743]: E0310 15:07:28.917624 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.945037 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.967972 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.968015 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.968026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.968043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.968053 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4743]: I0310 15:07:28.977742 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:28Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.023305 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.053722 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.070262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.070296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.070305 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.070326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.070338 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.093435 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac42
6ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.136339 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.172803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.172858 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.172868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.172886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.172897 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.173608 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z 
is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.211903 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc 
kubenswrapper[4743]: I0310 15:07:29.252517 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.275500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.275546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.275556 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.275573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.275584 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.293349 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.331083 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.376422 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.378173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.378201 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.378211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.378228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.378238 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.412755 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.455120 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.480847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.480891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.480903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.480921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.480932 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.493248 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.534625 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.544569 4743 generic.go:334] "Generic (PLEG): container finished" podID="32b91cde-a621-4d27-a253-12a8effb3b0b" containerID="8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f" exitCode=0 Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.544658 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" 
event={"ID":"32b91cde-a621-4d27-a253-12a8effb3b0b","Type":"ContainerDied","Data":"8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.552449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.553117 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.553165 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.553185 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.577357 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.579893 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.581801 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.583278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.583344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 
15:07:29.583358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.583379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.583394 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.617444 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.653097 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.686364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.686409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.686419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.686438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.686451 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.692498 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.734840 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.774369 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36
fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.789089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.789129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.789138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.789156 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.789167 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.813043 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc 
kubenswrapper[4743]: I0310 15:07:29.853139 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.892393 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18
e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.893250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.893312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.893327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.893362 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.893375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.915005 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:29 crc kubenswrapper[4743]: E0310 15:07:29.915180 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.934966 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 
')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.974536 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a4
43efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:29Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.997164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.997224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.997235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.997257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4743]: I0310 15:07:29.997270 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.015840 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.053451 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.098063 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.100961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.101018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.101032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 
15:07:30.101054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.101070 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.133239 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.172710 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.205118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.205168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.205182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.205207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.205221 4743 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.216910 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.262146 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8c
d392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.299157 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.307720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.307754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.307765 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.307779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.307789 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.336653 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.409917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.410029 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.410044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.410060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.410072 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.513132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.513168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.513181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.513198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.513210 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.560983 4743 generic.go:334] "Generic (PLEG): container finished" podID="32b91cde-a621-4d27-a253-12a8effb3b0b" containerID="a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de" exitCode=0 Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.561047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" event={"ID":"32b91cde-a621-4d27-a253-12a8effb3b0b","Type":"ContainerDied","Data":"a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.576412 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.597006 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.619575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.619621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.619634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.619654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.619671 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.622364 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.640467 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d9
6300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.656921 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.719706 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc 
kubenswrapper[4743]: I0310 15:07:30.730167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.730226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.730238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.730260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.730274 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.747876 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.774198 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.815755 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.833697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.833743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.833754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.833771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.833782 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.840668 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.856150 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.869425 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.881800 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.893364 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.915314 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.915359 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:30 crc kubenswrapper[4743]: E0310 15:07:30.915459 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.915480 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:30 crc kubenswrapper[4743]: E0310 15:07:30.915546 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:30 crc kubenswrapper[4743]: E0310 15:07:30.915603 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.933494 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.942482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.942515 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.942523 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.942537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.942547 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4743]: I0310 15:07:30.976614 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a4
43efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:30Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.012703 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.044686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.044712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.044721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.044737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.044747 4743 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.058177 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.147466 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.147508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.147520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.147538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.147550 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.250541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.250587 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.250598 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.250617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.250626 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.353434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.353483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.353495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.353529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.353540 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.456669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.456731 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.456781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.456844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.456872 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.559436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.559465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.559474 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.559489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.559498 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.568383 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" event={"ID":"32b91cde-a621-4d27-a253-12a8effb3b0b","Type":"ContainerStarted","Data":"016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.588436 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:0
5:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.609008 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.630194 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.644775 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.648250 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:31 crc kubenswrapper[4743]: E0310 15:07:31.648530 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:31 crc kubenswrapper[4743]: E0310 15:07:31.648886 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs podName:acbc8434-7aab-481b-ae0e-08696da082ad nodeName:}" failed. 
No retries permitted until 2026-03-10 15:07:39.648783511 +0000 UTC m=+124.355598259 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs") pod "network-metrics-daemon-vcq2w" (UID: "acbc8434-7aab-481b-ae0e-08696da082ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.661311 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.663018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.663064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.663078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.663099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.663113 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.677555 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.693467 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.707513 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc 
kubenswrapper[4743]: I0310 15:07:31.721962 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w
qbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.738342 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab
02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.758239 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.767047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.767095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.767108 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 
15:07:31.767163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.767179 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.775125 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.791298 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.809581 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.825494 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.840863 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.852125 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.869751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.869840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.869856 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.869883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.869901 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.871072 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:31Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.915162 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:31 crc kubenswrapper[4743]: E0310 15:07:31.915312 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.974072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.974118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.974136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.974157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4743]: I0310 15:07:31.974170 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.077686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.077753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.077766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.077789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.077802 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.181041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.181105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.181122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.181144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.181158 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.283386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.283448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.283461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.283481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.283494 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.386128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.386180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.386193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.386214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.386230 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.490005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.490089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.490107 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.490138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.490161 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.579262 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/0.log" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.582830 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0" exitCode=1 Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.582854 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.584133 4743 scope.go:117] "RemoveContainer" containerID="01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.599008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.599079 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.599094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.599120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.599135 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.620635 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1
079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.638059 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.669374 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"message\\\":\\\"ointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.970423 6600 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 15:07:31.970654 6600 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.970923 6600 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:07:31.971049 6600 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.971254 6600 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 15:07:31.971620 6600 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:31.971645 6600 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:07:31.971660 6600 factory.go:656] Stopping watch factory\\\\nI0310 15:07:31.971676 6600 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:31.971712 6600 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 
15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.686930 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.704834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.704883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.704919 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.704948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.704968 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.707474 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac42
6ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.721743 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b
5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:
29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.736625 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.750097 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc 
kubenswrapper[4743]: I0310 15:07:32.763800 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.783909 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a4
43efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.800291 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.809439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.809543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.809576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.809603 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.809623 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.816715 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.831656 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.844452 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.856102 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.865181 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.875185 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.886121 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:32Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.914555 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.914632 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.914676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:32 crc kubenswrapper[4743]: E0310 15:07:32.914726 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:32 crc kubenswrapper[4743]: E0310 15:07:32.914857 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:32 crc kubenswrapper[4743]: E0310 15:07:32.914946 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.914981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.915034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.915052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.915077 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4743]: I0310 15:07:32.915092 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.017644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.017674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.017682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.017698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.017708 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.120709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.120783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.120798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.120843 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.120857 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.223390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.223428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.223439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.223502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.223517 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.326359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.326423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.326433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.326450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.326460 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.429355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.429407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.429416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.429433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.429445 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.533101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.533158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.533171 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.533191 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.533205 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.589027 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/1.log" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.589851 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/0.log" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.593090 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6" exitCode=1 Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.593175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.593257 4743 scope.go:117] "RemoveContainer" containerID="01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.594222 4743 scope.go:117] "RemoveContainer" containerID="4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6" Mar 10 15:07:33 crc kubenswrapper[4743]: E0310 15:07:33.594441 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.631216 4743 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce
4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38
b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.635714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.635771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.635783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.635807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.635860 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.649268 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.677766 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"message\\\":\\\"ointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.970423 6600 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 15:07:31.970654 6600 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.970923 6600 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:07:31.971049 6600 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.971254 6600 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 15:07:31.971620 6600 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:31.971645 6600 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:07:31.971660 6600 factory.go:656] Stopping watch factory\\\\nI0310 15:07:31.971676 6600 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:31.971712 6600 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:33Z\\\",\\\"message\\\":\\\"VN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 15:07:33.522736 6771 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{
\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.693341 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.711046 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.737152 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.739943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.739987 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.739999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.740019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.740033 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.760124 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z 
is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.772713 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc 
kubenswrapper[4743]: I0310 15:07:33.784703 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w
qbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.797106 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab
02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.816322 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.832277 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.844024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.844089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.844102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.844123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.844136 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.850309 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.864286 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.875434 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.886046 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.896445 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.907474 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:33 crc 
kubenswrapper[4743]: I0310 15:07:33.914935 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:33 crc kubenswrapper[4743]: E0310 15:07:33.915143 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.947635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.948210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.948257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.948283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4743]: I0310 15:07:33.948490 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.053760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.053841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.053854 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.053874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.053886 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.156807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.156892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.156905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.156926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.156939 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.260506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.260564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.260574 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.260596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.260609 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.363898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.363956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.363967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.363988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.364000 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.467024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.467100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.467117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.467137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.467151 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.571447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.571942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.571963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.571987 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.572005 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.599601 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/1.log" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.676277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.676346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.676364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.676394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.676418 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.779475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.779540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.779555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.779579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.779595 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.881900 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.881950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.881963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.881979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.881989 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.914918 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.914931 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:34 crc kubenswrapper[4743]: E0310 15:07:34.915101 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.915164 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:34 crc kubenswrapper[4743]: E0310 15:07:34.915233 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:34 crc kubenswrapper[4743]: E0310 15:07:34.915433 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.984072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.984119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.984130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.984147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4743]: I0310 15:07:34.984159 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.086902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.086956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.086965 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.086983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.086994 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.191159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.191218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.191235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.191261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.191279 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.294797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.294888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.294905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.294936 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.294956 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.397591 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.397639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.397651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.397669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.397682 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.500767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.500850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.500865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.500890 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.500904 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.604138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.604181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.604194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.604209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.604221 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.707066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.707138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.707151 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.707174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.707187 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.810436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.810486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.810498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.810515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.810527 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4743]: E0310 15:07:35.911366 4743 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.914306 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:35 crc kubenswrapper[4743]: E0310 15:07:35.914451 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.929694 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:35Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:35 crc 
kubenswrapper[4743]: I0310 15:07:35.942772 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:35Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.955713 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:35Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.972170 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:35Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.987655 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:35Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:35 crc kubenswrapper[4743]: I0310 15:07:35.998564 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:35Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.010452 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.021146 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.033035 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.047046 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.064710 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.083961 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.098907 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.112314 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.127280 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.153367 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.169377 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.190089 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01fd219db8680c976c7d373bcbda8a9f0202d42a80c3bef9fb17d933c4cdfcb0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"message\\\":\\\"ointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.970423 
6600 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 15:07:31.970654 6600 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.970923 6600 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:07:31.971049 6600 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:07:31.971254 6600 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 15:07:31.971620 6600 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:31.971645 6600 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:07:31.971660 6600 factory.go:656] Stopping watch factory\\\\nI0310 15:07:31.971676 6600 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:31.971712 6600 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:33Z\\\",\\\"message\\\":\\\"VN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 15:07:33.522736 6771 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate 
ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:36 crc kubenswrapper[4743]: E0310 15:07:36.355013 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.915372 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.915442 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:36 crc kubenswrapper[4743]: E0310 15:07:36.916112 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:36 crc kubenswrapper[4743]: E0310 15:07:36.916177 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:36 crc kubenswrapper[4743]: I0310 15:07:36.915466 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:36 crc kubenswrapper[4743]: E0310 15:07:36.916440 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.422141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.422180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.422189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.422206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.422216 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4743]: E0310 15:07:37.435981 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.441393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.441452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.441469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.441493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.441511 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4743]: E0310 15:07:37.456786 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.462025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.462072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.462081 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.462101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.462111 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4743]: E0310 15:07:37.477345 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.482855 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.483097 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.483115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.483136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.483149 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4743]: E0310 15:07:37.499307 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.504753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.504851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.504867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.504893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.504908 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4743]: E0310 15:07:37.518051 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:37 crc kubenswrapper[4743]: E0310 15:07:37.518186 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:37 crc kubenswrapper[4743]: I0310 15:07:37.914850 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:37 crc kubenswrapper[4743]: E0310 15:07:37.915066 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:38 crc kubenswrapper[4743]: I0310 15:07:38.914584 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:38 crc kubenswrapper[4743]: I0310 15:07:38.914704 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:38 crc kubenswrapper[4743]: E0310 15:07:38.914765 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:38 crc kubenswrapper[4743]: I0310 15:07:38.914587 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:38 crc kubenswrapper[4743]: E0310 15:07:38.914979 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:38 crc kubenswrapper[4743]: E0310 15:07:38.915156 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:39 crc kubenswrapper[4743]: I0310 15:07:39.747596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:39 crc kubenswrapper[4743]: E0310 15:07:39.747917 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:39 crc kubenswrapper[4743]: E0310 15:07:39.748041 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs podName:acbc8434-7aab-481b-ae0e-08696da082ad nodeName:}" failed. No retries permitted until 2026-03-10 15:07:55.748012035 +0000 UTC m=+140.454826793 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs") pod "network-metrics-daemon-vcq2w" (UID: "acbc8434-7aab-481b-ae0e-08696da082ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:39 crc kubenswrapper[4743]: I0310 15:07:39.915473 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:39 crc kubenswrapper[4743]: E0310 15:07:39.915785 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:40 crc kubenswrapper[4743]: I0310 15:07:40.915303 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:40 crc kubenswrapper[4743]: I0310 15:07:40.915412 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:40 crc kubenswrapper[4743]: I0310 15:07:40.915336 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:40 crc kubenswrapper[4743]: E0310 15:07:40.915607 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:40 crc kubenswrapper[4743]: E0310 15:07:40.915734 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:40 crc kubenswrapper[4743]: E0310 15:07:40.915980 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:41 crc kubenswrapper[4743]: E0310 15:07:41.357245 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.601649 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.602686 4743 scope.go:117] "RemoveContainer" containerID="4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6" Mar 10 15:07:41 crc kubenswrapper[4743]: E0310 15:07:41.602934 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.654179 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.671677 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.689966 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.705426 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.716894 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc 
kubenswrapper[4743]: I0310 15:07:41.727599 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w
qbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.743011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab
02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.761228 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.780410 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.796409 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.815652 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.836996 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.851507 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.867109 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.885270 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc 
kubenswrapper[4743]: I0310 15:07:41.914688 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:41 crc kubenswrapper[4743]: E0310 15:07:41.914905 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.915653 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\
":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.935643 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:41 crc kubenswrapper[4743]: I0310 15:07:41.959147 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:33Z\\\",\\\"message\\\":\\\"VN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 15:07:33.522736 6771 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:42 crc kubenswrapper[4743]: I0310 15:07:42.947208 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:42 crc kubenswrapper[4743]: E0310 15:07:42.947355 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:42 crc kubenswrapper[4743]: I0310 15:07:42.947789 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:42 crc kubenswrapper[4743]: E0310 15:07:42.947861 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:42 crc kubenswrapper[4743]: I0310 15:07:42.948159 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:42 crc kubenswrapper[4743]: E0310 15:07:42.948238 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:43 crc kubenswrapper[4743]: I0310 15:07:43.915071 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:43 crc kubenswrapper[4743]: E0310 15:07:43.915344 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:44 crc kubenswrapper[4743]: I0310 15:07:44.914441 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:44 crc kubenswrapper[4743]: I0310 15:07:44.914532 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:44 crc kubenswrapper[4743]: I0310 15:07:44.914584 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:44 crc kubenswrapper[4743]: E0310 15:07:44.914683 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:44 crc kubenswrapper[4743]: E0310 15:07:44.914876 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:44 crc kubenswrapper[4743]: E0310 15:07:44.915034 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:45 crc kubenswrapper[4743]: I0310 15:07:45.914458 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:45 crc kubenswrapper[4743]: E0310 15:07:45.914770 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:45 crc kubenswrapper[4743]: I0310 15:07:45.931649 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 15:07:45 crc kubenswrapper[4743]: I0310 15:07:45.944484 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"n
ame\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146
970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900
92272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"ter
minated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4743]: I0310 15:07:45.972209 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.000358 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:33Z\\\",\\\"message\\\":\\\"VN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 15:07:33.522736 6771 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.017603 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.031244 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.060421 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.074027 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.086251 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc 
kubenswrapper[4743]: I0310 15:07:46.101055 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w
qbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.118166 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab
02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.134765 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.153133 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.170694 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.188441 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.205667 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.222246 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.235457 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.252193 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc 
kubenswrapper[4743]: E0310 15:07:46.359096 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.914952 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.915047 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:46 crc kubenswrapper[4743]: E0310 15:07:46.915168 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:46 crc kubenswrapper[4743]: I0310 15:07:46.915065 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:46 crc kubenswrapper[4743]: E0310 15:07:46.915371 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:46 crc kubenswrapper[4743]: E0310 15:07:46.915400 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.632085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.632180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.632204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.632235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.632257 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4743]: E0310 15:07:47.652355 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.658103 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.658222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.658287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.658366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.658434 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4743]: E0310 15:07:47.677537 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.683327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.683542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.683624 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.683708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.683784 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4743]: E0310 15:07:47.701999 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.707738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.707804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.707865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.707940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.707963 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4743]: E0310 15:07:47.729365 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.734546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.734606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.734631 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.734660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.734678 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4743]: E0310 15:07:47.754432 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4743]: E0310 15:07:47.754662 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:47 crc kubenswrapper[4743]: I0310 15:07:47.915012 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:47 crc kubenswrapper[4743]: E0310 15:07:47.915267 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:48 crc kubenswrapper[4743]: I0310 15:07:48.915037 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:48 crc kubenswrapper[4743]: I0310 15:07:48.915075 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:48 crc kubenswrapper[4743]: I0310 15:07:48.915161 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:48 crc kubenswrapper[4743]: E0310 15:07:48.915304 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:48 crc kubenswrapper[4743]: E0310 15:07:48.915545 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:48 crc kubenswrapper[4743]: E0310 15:07:48.915690 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:49 crc kubenswrapper[4743]: I0310 15:07:49.915102 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:49 crc kubenswrapper[4743]: E0310 15:07:49.915332 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:50 crc kubenswrapper[4743]: I0310 15:07:50.915388 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:50 crc kubenswrapper[4743]: I0310 15:07:50.915499 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:50 crc kubenswrapper[4743]: I0310 15:07:50.915388 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:50 crc kubenswrapper[4743]: E0310 15:07:50.915577 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:50 crc kubenswrapper[4743]: E0310 15:07:50.915745 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:50 crc kubenswrapper[4743]: E0310 15:07:50.916032 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:51 crc kubenswrapper[4743]: E0310 15:07:51.361270 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:07:51 crc kubenswrapper[4743]: I0310 15:07:51.914949 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:51 crc kubenswrapper[4743]: E0310 15:07:51.915110 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:52 crc kubenswrapper[4743]: I0310 15:07:52.915155 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:52 crc kubenswrapper[4743]: I0310 15:07:52.915155 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:52 crc kubenswrapper[4743]: I0310 15:07:52.915207 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:52 crc kubenswrapper[4743]: E0310 15:07:52.915691 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:52 crc kubenswrapper[4743]: E0310 15:07:52.915846 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:52 crc kubenswrapper[4743]: E0310 15:07:52.915993 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:52 crc kubenswrapper[4743]: I0310 15:07:52.916204 4743 scope.go:117] "RemoveContainer" containerID="4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6" Mar 10 15:07:53 crc kubenswrapper[4743]: I0310 15:07:53.131910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:53 crc kubenswrapper[4743]: I0310 15:07:53.132057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:53 crc kubenswrapper[4743]: I0310 15:07:53.132101 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.132191 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.132213 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:08:57.13216234 +0000 UTC m=+201.838977088 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.132278 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:57.132261283 +0000 UTC m=+201.839076211 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.132345 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.132469 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:08:57.132445628 +0000 UTC m=+201.839260376 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:53 crc kubenswrapper[4743]: I0310 15:07:53.233894 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:53 crc kubenswrapper[4743]: I0310 15:07:53.233997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.234161 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.234167 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.234214 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.234227 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.234182 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.234250 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.234291 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:57.234272789 +0000 UTC m=+201.941087637 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.234323 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:57.23430651 +0000 UTC m=+201.941121268 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:53 crc kubenswrapper[4743]: I0310 15:07:53.915131 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:53 crc kubenswrapper[4743]: E0310 15:07:53.915860 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:53 crc kubenswrapper[4743]: I0310 15:07:53.998998 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/1.log" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.001870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e"} Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.002561 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.018380 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.030671 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.047734 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:33Z\\\",\\\"message\\\":\\\"VN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 15:07:33.522736 6771 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate 
ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.070730 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.080532 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.092638 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.101901 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df7
4ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.111738 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.124732 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.139469 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a4
43efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.154313 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.342495 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.356652 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.369480 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.378601 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.388563 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.400131 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.410851 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.420804 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.915250 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:54 crc kubenswrapper[4743]: E0310 15:07:54.915491 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.915555 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:54 crc kubenswrapper[4743]: I0310 15:07:54.915612 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:54 crc kubenswrapper[4743]: E0310 15:07:54.915752 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:54 crc kubenswrapper[4743]: E0310 15:07:54.915885 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.008456 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/2.log" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.009305 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/1.log" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.012053 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e" exitCode=1 Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.012122 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e"} Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.012189 4743 scope.go:117] "RemoveContainer" containerID="4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.012780 4743 scope.go:117] "RemoveContainer" containerID="6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e" Mar 10 15:07:55 crc kubenswrapper[4743]: E0310 15:07:55.012962 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:07:55 crc 
kubenswrapper[4743]: I0310 15:07:55.028989 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496
acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.045180 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.060224 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.072706 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.083724 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.097665 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.110833 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.125899 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.143573 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.156288 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.169229 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.183446 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.205842 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:33Z\\\",\\\"message\\\":\\\"VN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 15:07:33.522736 6771 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 
handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.228856 4743 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f9
75ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2
b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.242769 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.259502 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.274776 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.285965 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc 
kubenswrapper[4743]: I0310 15:07:55.298276 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.840165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:55 crc kubenswrapper[4743]: E0310 15:07:55.840413 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:55 crc kubenswrapper[4743]: E0310 15:07:55.840709 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs podName:acbc8434-7aab-481b-ae0e-08696da082ad nodeName:}" failed. 
No retries permitted until 2026-03-10 15:08:27.84049447 +0000 UTC m=+172.547309248 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs") pod "network-metrics-daemon-vcq2w" (UID: "acbc8434-7aab-481b-ae0e-08696da082ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.915021 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:55 crc kubenswrapper[4743]: E0310 15:07:55.915402 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.948981 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.966566 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4743]: I0310 15:07:55.985130 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.006440 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4819490ea04e585abe6d5f2e329c52e7f2d2a5d855da32df062f01ae674a64e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:33Z\\\",\\\"message\\\":\\\"VN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 15:07:33.522736 6771 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate ha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 
handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.017293 4743 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/2.log" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.021185 4743 scope.go:117] "RemoveContainer" containerID="6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e" Mar 10 15:07:56 crc kubenswrapper[4743]: E0310 15:07:56.021377 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.023305 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.047657 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.061451 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.076356 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.088780 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.101666 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.116083 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.131084 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.146625 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.159143 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.173146 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.185754 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.198970 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.211598 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.223888 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.240644 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a4
43efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.255234 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.270769 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.286755 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.304225 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.319472 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.332434 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.346779 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.357910 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: E0310 15:07:56.362807 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.371873 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.385766 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.400913 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.422867 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.455698 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.473917 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.493516 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.512220 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df7
4ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.526111 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.538348 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.914956 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.914967 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:56 crc kubenswrapper[4743]: I0310 15:07:56.915183 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:56 crc kubenswrapper[4743]: E0310 15:07:56.915485 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:56 crc kubenswrapper[4743]: E0310 15:07:56.915596 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:56 crc kubenswrapper[4743]: E0310 15:07:56.915771 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:57 crc kubenswrapper[4743]: I0310 15:07:57.915204 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:57 crc kubenswrapper[4743]: E0310 15:07:57.915397 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.047990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.048036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.048049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.048066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.048076 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.069652 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:58Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.075551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.075602 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.075610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.075629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.075640 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.097897 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:58Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.104147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.104203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.104217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.104240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.104257 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.125743 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:58Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.132644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.132694 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.132704 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.132722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.132732 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.151535 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:58Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.156910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.156949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.156959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.156977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.156988 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.172439 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:58Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.172562 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.914748 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.914857 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.914914 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:58 crc kubenswrapper[4743]: I0310 15:07:58.915012 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.915111 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:58 crc kubenswrapper[4743]: E0310 15:07:58.915239 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:59 crc kubenswrapper[4743]: I0310 15:07:59.915295 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:07:59 crc kubenswrapper[4743]: E0310 15:07:59.915536 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:00 crc kubenswrapper[4743]: I0310 15:08:00.915248 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:00 crc kubenswrapper[4743]: I0310 15:08:00.915315 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:00 crc kubenswrapper[4743]: I0310 15:08:00.915362 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:00 crc kubenswrapper[4743]: E0310 15:08:00.915449 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:00 crc kubenswrapper[4743]: E0310 15:08:00.915636 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:00 crc kubenswrapper[4743]: E0310 15:08:00.915799 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:01 crc kubenswrapper[4743]: E0310 15:08:01.364984 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:01 crc kubenswrapper[4743]: I0310 15:08:01.914885 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:01 crc kubenswrapper[4743]: E0310 15:08:01.915063 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:02 crc kubenswrapper[4743]: I0310 15:08:02.914554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:02 crc kubenswrapper[4743]: I0310 15:08:02.914554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:02 crc kubenswrapper[4743]: E0310 15:08:02.914746 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:02 crc kubenswrapper[4743]: I0310 15:08:02.914587 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:02 crc kubenswrapper[4743]: E0310 15:08:02.914991 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:02 crc kubenswrapper[4743]: E0310 15:08:02.914904 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:03 crc kubenswrapper[4743]: I0310 15:08:03.915318 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:03 crc kubenswrapper[4743]: E0310 15:08:03.915633 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:04 crc kubenswrapper[4743]: I0310 15:08:04.915265 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:04 crc kubenswrapper[4743]: I0310 15:08:04.915378 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:04 crc kubenswrapper[4743]: I0310 15:08:04.915284 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:04 crc kubenswrapper[4743]: E0310 15:08:04.915523 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:04 crc kubenswrapper[4743]: E0310 15:08:04.915669 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:04 crc kubenswrapper[4743]: E0310 15:08:04.915715 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:05 crc kubenswrapper[4743]: I0310 15:08:05.915196 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:05 crc kubenswrapper[4743]: E0310 15:08:05.915446 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:05 crc kubenswrapper[4743]: I0310 15:08:05.941990 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:05 crc kubenswrapper[4743]: I0310 15:08:05.978046 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:05 crc kubenswrapper[4743]: I0310 15:08:05.993231 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.014226 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.036639 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.061275 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc 
kubenswrapper[4743]: I0310 15:08:06.078666 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.094631 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.115998 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.133416 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.156107 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.173276 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.186699 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.198159 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.212777 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.227784 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.241758 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.253704 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.267004 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:06 crc kubenswrapper[4743]: E0310 15:08:06.366263 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.915278 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.915278 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.915344 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:06 crc kubenswrapper[4743]: E0310 15:08:06.915976 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:06 crc kubenswrapper[4743]: E0310 15:08:06.916054 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:06 crc kubenswrapper[4743]: E0310 15:08:06.916144 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:06 crc kubenswrapper[4743]: I0310 15:08:06.916617 4743 scope.go:117] "RemoveContainer" containerID="6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e" Mar 10 15:08:06 crc kubenswrapper[4743]: E0310 15:08:06.917151 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:08:07 crc kubenswrapper[4743]: I0310 15:08:07.915256 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:07 crc kubenswrapper[4743]: E0310 15:08:07.915494 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.481938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.482070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.482088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.482115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.482134 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:08Z","lastTransitionTime":"2026-03-10T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:08 crc kubenswrapper[4743]: E0310 15:08:08.503959 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.510212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.510310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.510329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.510356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.510375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:08Z","lastTransitionTime":"2026-03-10T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:08 crc kubenswrapper[4743]: E0310 15:08:08.528222 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.534139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.534190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.534204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.534224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.534241 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:08Z","lastTransitionTime":"2026-03-10T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} [identical "Error updating node status, will retry" entry logged by kubenswrapper[4743] at Mar 10 15:08:08.553524 omitted: same node-status patch payload for node "crc" rejected by webhook "node.network-node-identity.openshift.io" with the same error — tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:08Z is after 2025-08-24T17:21:41Z] Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.559601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.559655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.559673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.559699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.559715 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:08Z","lastTransitionTime":"2026-03-10T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:08 crc kubenswrapper[4743]: E0310 15:08:08.577386 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.582202 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.582245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.582261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.582283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.582297 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:08Z","lastTransitionTime":"2026-03-10T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:08 crc kubenswrapper[4743]: E0310 15:08:08.595627 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:08 crc kubenswrapper[4743]: E0310 15:08:08.595791 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.914978 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.915002 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:08 crc kubenswrapper[4743]: E0310 15:08:08.915098 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:08 crc kubenswrapper[4743]: I0310 15:08:08.915127 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:08 crc kubenswrapper[4743]: E0310 15:08:08.915330 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:08 crc kubenswrapper[4743]: E0310 15:08:08.915422 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:09 crc kubenswrapper[4743]: I0310 15:08:09.915350 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:09 crc kubenswrapper[4743]: E0310 15:08:09.916055 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.078903 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/0.log" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.078960 4743 generic.go:334] "Generic (PLEG): container finished" podID="1736aae6-d840-4b31-8c44-6637a05f37ef" containerID="403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e" exitCode=1 Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.078993 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgbfn" event={"ID":"1736aae6-d840-4b31-8c44-6637a05f37ef","Type":"ContainerDied","Data":"403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e"} Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.079442 4743 scope.go:117] "RemoveContainer" containerID="403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.095511 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.114072 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.129981 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.145786 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.157868 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.170434 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.181490 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.191723 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.203778 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.214957 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.238004 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.254377 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.275556 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.304196 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.320782 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.334741 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.351164 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.363664 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:09Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340\\\\n2026-03-10T15:07:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340 to /host/opt/cni/bin/\\\\n2026-03-10T15:07:24Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:24Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:08:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.374357 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:10 crc 
kubenswrapper[4743]: I0310 15:08:10.914843 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.914986 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:10 crc kubenswrapper[4743]: I0310 15:08:10.915035 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:10 crc kubenswrapper[4743]: E0310 15:08:10.915181 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:10 crc kubenswrapper[4743]: E0310 15:08:10.915404 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:10 crc kubenswrapper[4743]: E0310 15:08:10.915605 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.091107 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/0.log" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.091272 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgbfn" event={"ID":"1736aae6-d840-4b31-8c44-6637a05f37ef","Type":"ContainerStarted","Data":"88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe"} Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.117526 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\
\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6
877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.136735 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.161364 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.185625 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.203995 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.219690 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.239518 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.263188 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-10T15:08:09Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340\\\\n2026-03-10T15:07:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340 to /host/opt/cni/bin/\\\\n2026-03-10T15:07:24Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:24Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:08:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.278259 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc 
kubenswrapper[4743]: I0310 15:08:11.295730 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.315184 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a4
43efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.334994 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.353995 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: E0310 15:08:11.367986 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.374069 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.411778 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.440737 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.468343 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.484985 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.498707 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4743]: I0310 15:08:11.915006 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:11 crc kubenswrapper[4743]: E0310 15:08:11.915279 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:12 crc kubenswrapper[4743]: I0310 15:08:12.914854 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:12 crc kubenswrapper[4743]: I0310 15:08:12.914965 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:12 crc kubenswrapper[4743]: E0310 15:08:12.915025 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:12 crc kubenswrapper[4743]: I0310 15:08:12.914965 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:12 crc kubenswrapper[4743]: E0310 15:08:12.915202 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:12 crc kubenswrapper[4743]: E0310 15:08:12.915322 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:13 crc kubenswrapper[4743]: I0310 15:08:13.915842 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:13 crc kubenswrapper[4743]: E0310 15:08:13.916210 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:14 crc kubenswrapper[4743]: I0310 15:08:14.915306 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:14 crc kubenswrapper[4743]: I0310 15:08:14.915356 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:14 crc kubenswrapper[4743]: I0310 15:08:14.915392 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:14 crc kubenswrapper[4743]: E0310 15:08:14.915564 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:14 crc kubenswrapper[4743]: E0310 15:08:14.915682 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:14 crc kubenswrapper[4743]: E0310 15:08:14.915921 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:15 crc kubenswrapper[4743]: I0310 15:08:15.914949 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:15 crc kubenswrapper[4743]: E0310 15:08:15.915936 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:15 crc kubenswrapper[4743]: I0310 15:08:15.932991 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f1296
2a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:15 crc 
kubenswrapper[4743]: I0310 15:08:15.949901 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f258464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:15 crc kubenswrapper[4743]: I0310 15:08:15.973351 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.005583 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.042683 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.065636 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.088482 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.103861 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-10T15:08:09Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340\\\\n2026-03-10T15:07:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340 to /host/opt/cni/bin/\\\\n2026-03-10T15:07:24Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:24Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:08:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.119993 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc 
kubenswrapper[4743]: I0310 15:08:16.138315 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.154040 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.171121 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.186316 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.202712 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.214052 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.226845 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.237534 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.251312 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.273130 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4743]: E0310 15:08:16.369701 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.914413 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.914582 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:16 crc kubenswrapper[4743]: E0310 15:08:16.914675 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:16 crc kubenswrapper[4743]: I0310 15:08:16.914697 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:16 crc kubenswrapper[4743]: E0310 15:08:16.914912 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:16 crc kubenswrapper[4743]: E0310 15:08:16.915020 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:17 crc kubenswrapper[4743]: I0310 15:08:17.915404 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:17 crc kubenswrapper[4743]: E0310 15:08:17.915682 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:17 crc kubenswrapper[4743]: I0310 15:08:17.917139 4743 scope.go:117] "RemoveContainer" containerID="6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.121710 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/2.log" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.126403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d"} Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.127339 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.158014 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.173781 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.191887 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.218655 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.235932 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.251182 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.267676 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.283126 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-10T15:08:09Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340\\\\n2026-03-10T15:07:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340 to /host/opt/cni/bin/\\\\n2026-03-10T15:07:24Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:24Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:08:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.318958 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc 
kubenswrapper[4743]: I0310 15:08:18.331780 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.343664 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18
e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.362548 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.387206 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.410463 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.465479 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.479602 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.496436 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.511605 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.524127 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.681226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.681292 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.681305 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.681333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.681346 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:18Z","lastTransitionTime":"2026-03-10T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.695846 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.700202 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.700261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.700280 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.700307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.700326 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:18Z","lastTransitionTime":"2026-03-10T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.723588 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.728936 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.728974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.728986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.729005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.729019 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:18Z","lastTransitionTime":"2026-03-10T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.742724 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.748086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.748117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.748128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.748147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.748161 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:18Z","lastTransitionTime":"2026-03-10T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.760143 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.765362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.765395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.765408 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.765426 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.765438 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:18Z","lastTransitionTime":"2026-03-10T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.778034 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e532390-ce89-4ad9-81e4-f384a9988976\\\",\\\"systemUUID\\\":\\\"d399f706-59cf-40ea-a3ad-c58404098384\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.778256 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.915085 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.915124 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:18 crc kubenswrapper[4743]: I0310 15:08:18.915159 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.915439 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.915569 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:18 crc kubenswrapper[4743]: E0310 15:08:18.915672 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.132504 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/3.log" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.133326 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/2.log" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.137903 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d" exitCode=1 Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.137970 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d"} Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.138035 4743 scope.go:117] "RemoveContainer" containerID="6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.139493 4743 scope.go:117] "RemoveContainer" 
containerID="7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d" Mar 10 15:08:19 crc kubenswrapper[4743]: E0310 15:08:19.139857 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.169208 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.189789 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.207376 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.226529 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b259247db29f7e873753d04496e9ffdf9c0d3585db282b87ee0c056129e0e9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"15:07:54.413289 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:54.413256 7022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:54.413312 7022 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:07:54.413351 7022 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0310 15:07:54.413378 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:54.413395 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:54.413430 7022 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:07:54.413433 7022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:54.413481 7022 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:54.413512 7022 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:07:54.413541 7022 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:54.413551 7022 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:54.413564 7022 factory.go:656] Stopping watch factory\\\\nI0310 15:07:54.413579 7022 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:07:54.413589 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:07:54.413598 7022 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:08:18.886046 7302 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.886273 7302 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.886636 7302 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887255 
7302 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887372 7302 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887603 7302 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.904994 7302 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 15:08:18.905016 7302 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 15:08:18.905070 7302 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:08:18.905103 7302 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:08:18.905192 7302 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":
\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xndd
q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.239894 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.251390 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.268500 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.284671 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-10T15:08:09Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340\\\\n2026-03-10T15:07:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340 to /host/opt/cni/bin/\\\\n2026-03-10T15:07:24Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:24Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:08:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.298571 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc 
kubenswrapper[4743]: I0310 15:08:19.314783 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.330146 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a4
43efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.345087 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.361143 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.378773 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.392480 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.403039 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.413717 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.423128 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.436790 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4743]: I0310 15:08:19.914912 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:19 crc kubenswrapper[4743]: E0310 15:08:19.915121 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.147466 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/3.log" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.151872 4743 scope.go:117] "RemoveContainer" containerID="7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d" Mar 10 15:08:20 crc kubenswrapper[4743]: E0310 15:08:20.152451 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.165945 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.179721 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf6918146dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.195843 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.211186 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.230463 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.245735 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.262921 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.277943 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.294125 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.311388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.338296 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.355748 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.373938 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.404920 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:08:18.886046 7302 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.886273 7302 reflector.go:311] Stopping reflector *v1.Namespace (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.886636 7302 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887255 7302 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887372 7302 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887603 7302 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.904994 7302 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 15:08:18.905016 7302 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 15:08:18.905070 7302 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:08:18.905103 7302 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:08:18.905192 7302 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.425231 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.442078 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.469223 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.488928 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-10T15:08:09Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340\\\\n2026-03-10T15:07:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340 to /host/opt/cni/bin/\\\\n2026-03-10T15:07:24Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:24Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:08:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.507315 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:20 crc 
kubenswrapper[4743]: I0310 15:08:20.915188 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.915343 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:20 crc kubenswrapper[4743]: I0310 15:08:20.915417 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:20 crc kubenswrapper[4743]: E0310 15:08:20.915561 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:20 crc kubenswrapper[4743]: E0310 15:08:20.915659 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:20 crc kubenswrapper[4743]: E0310 15:08:20.915929 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:21 crc kubenswrapper[4743]: E0310 15:08:21.370540 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:21 crc kubenswrapper[4743]: I0310 15:08:21.915054 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:21 crc kubenswrapper[4743]: E0310 15:08:21.916012 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:22 crc kubenswrapper[4743]: I0310 15:08:22.915200 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:22 crc kubenswrapper[4743]: I0310 15:08:22.915332 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:22 crc kubenswrapper[4743]: E0310 15:08:22.915413 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:22 crc kubenswrapper[4743]: I0310 15:08:22.915200 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:22 crc kubenswrapper[4743]: E0310 15:08:22.915782 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:22 crc kubenswrapper[4743]: E0310 15:08:22.915879 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:23 crc kubenswrapper[4743]: I0310 15:08:23.915149 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:23 crc kubenswrapper[4743]: E0310 15:08:23.915405 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:24 crc kubenswrapper[4743]: I0310 15:08:24.915387 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:24 crc kubenswrapper[4743]: I0310 15:08:24.915558 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:24 crc kubenswrapper[4743]: E0310 15:08:24.915675 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:24 crc kubenswrapper[4743]: I0310 15:08:24.915558 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:24 crc kubenswrapper[4743]: E0310 15:08:24.915868 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:24 crc kubenswrapper[4743]: E0310 15:08:24.916042 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:25 crc kubenswrapper[4743]: I0310 15:08:25.915559 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:25 crc kubenswrapper[4743]: E0310 15:08:25.915906 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:25 crc kubenswrapper[4743]: I0310 15:08:25.936214 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acbc8434-7aab-481b-ae0e-08696da082ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rblb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcq2w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:25 crc 
kubenswrapper[4743]: I0310 15:08:25.959101 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:25 crc kubenswrapper[4743]: I0310 15:08:25.977548 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d049bbf-95c6-4135-8808-1e453cf59a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9169757ac426ce68e171f74622b2e5f34654979ea9827e740de154ec15edad01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89445e203720353c889695431d10f5cd1ff53c8
05bb1460cc7903465d322fb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbln8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qrnln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.002404 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-s46xz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32b91cde-a621-4d27-a253-12a8effb3b0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016241ca5c0e5453f40c16a283faaf45cb939fea4a9bc27eaa806534006a8fc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e987889f4fa38ec490aef3983b0851fc775d9a36bcd21c64c8cfaf5ea9cb488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a57b4d4c6f85a991e015898f9e199dc9a7fa01b977c63c969f5b7f8cc72e3ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f1c44445621faf0244d4447c5163bcb64d96300a05668904c270ae92dcca87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7486a
73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7486a73e6c87d81209e12594f8fbe103a25cfbf78983d031bf4ea589bbf08dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a591d0628183b1e55ff36a1074e82c93141ab2c5710c174bd530b5fc6c03b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25d5fa11d44da46c87eaf05fc7cb345ed9891975ba2dfa3e8722cfc60ed16de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-md2gq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-s46xz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.025859 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1736aae6-d840-4b31-8c44-6637a05f37ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-10T15:08:09Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340\\\\n2026-03-10T15:07:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf988830-31d1-44cb-8868-77c092d5d340 to /host/opt/cni/bin/\\\\n2026-03-10T15:07:24Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:24Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:08:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6g99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.046525 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.062454 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://086e517d2ce472de2ac39390066de881ce205702ac5ca9533154aadb4d1382dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.078069 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t9mrg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d36817c-a2b7-49c1-92a9-2f9c54fd4f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4114859be568a5291d31b7831fc9937f187ab1061604a71e24bdcb9e2b20e9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qjc2x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t9mrg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.090973 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rrjqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ba758a8-e906-48f3-ae10-c045f908306e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bb2ae21522fda9b0e8458fbac92aa83220832c092ccd23bf69181
46dcd96032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqbpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rrjqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.112528 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6705d654-8d5a-4baa-ad0f-8baa7799965d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189ab04179aa2349bcc8b74507d505ef30674163951d930771f15f8d68af105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d3bb5b8628f193873ea348e4f9c8e28a45ed97cf85e8a827441b55f5e250c33\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:05:38.163716 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:05:38.167714 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:05:38.207465 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:05:38.212464 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:06:08.387861 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ab02d6f2fa6bf349bd0357ce828b043aa7afba72c1d125d1f67006ab880bf64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4878fa41ed681803135adcc96d283575cdb0d73c176ae0fda0276f37ae6f92a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.136482 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f090627-57d4-471b-810a-540b21da2e8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:06:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:06:32.383456 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:06:32.383739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:06:32.385579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3354306589/tls.crt::/tmp/serving-cert-3354306589/tls.key\\\\\\\"\\\\nI0310 15:06:32.808948 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:06:32.819185 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:06:32.819214 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:06:32.819240 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:06:32.819246 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:06:32.823848 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI0310 15:06:32.823847 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:06:32.825328 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825373 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:06:32.825410 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 15:06:32.825445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:06:32.825479 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:06:32.825511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:06:32.827602 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.155221 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b084ddbe7ab8c55c930ffdd6685277177e530c0d82df769067a566d65ddc9295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9742a35338caac6933ffc1f6faaa342ff37dc75642add2cd34320d0776d93dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.177137 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.190566 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6db778-e90a-4917-8c4a-47d6a17e8e06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c56ba1fb566ac7fd98f65b32b220d15862ba89043075c55673771e01ad6c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733245f3ecfa99f8aeb36647855afb9a50d246d9935e1bad1a3fe76a16644898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.205994 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5781a455-9df4-408a-9b78-9055f53dc9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f2b1507b76c239e4ce5c5e5d3cefd1b6c13f965263f6ac567cb865f66bc949e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80b6736b7d3b6b16d5c8a6c1b80f5ec550f25
8464c957692b0948526fe28b239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nmxc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.238417 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3cf0a77-a56b-463a-a655-efff86fbd815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e82445a0b55999e28b658a7b7717532a05fdf348190ab7c83dff791720bc294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba22b8df4268fadb57457cb19353e2a601b642141f975ab0974ce4a35fc366ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4606f8317b1c62cf2cec85eb81c78079a5db8207fc5e580b9b9e6834607a1529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e4eb566af987ef1b46a35d146970085117cb707a042a4b3572314fcf13d412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70409030a0714c1c07de4782d62b38f05d22557bd72dacdff58055d47fd3d2db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b997bc88d2ceebece2b866395a38b66a9c43cc934f9c44e1e68cc7b46c0517d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f2eeef7a1c0bd303f51321e21c500edfe079a170cd5cdc2def9834ab2acd0b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f707dab5f0b76a4bbca1079313f3aa8cd392791f3148932679c5528364b931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.256719 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0f4ebde-ac7d-410f-ad0c-dcb0d2780903\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4699dc772cf31c2bac2713af95173ef83f81052e785e87a347054de023e4a9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48922ca169ceea9b854260d77fed3b67d22f9561e56f26d22d7aa5c3f350b247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42279c3f723e62268d8ac07aca4cea6f066ef44fe725e33f6b1d40fd5138d8f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44fd45ef597eb5ad23d1f256849f27ffa23562749dbb524b204a4cd65932d5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.276225 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fffac3c3b69c3668687fcc35124dae75becb20c80c7f4e111416f386f0ae467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.303437 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91ad6254-92fa-4092-8b86-2393f317f163\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:18Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:08:18.886046 7302 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.886273 7302 reflector.go:311] Stopping reflector *v1.Namespace (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.886636 7302 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887255 7302 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887372 7302 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.887603 7302 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:08:18.904994 7302 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 15:08:18.905016 7302 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 15:08:18.905070 7302 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:08:18.905103 7302 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:08:18.905192 7302 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76274b97a4d943e944
f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnddq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dxdms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4743]: E0310 15:08:26.371712 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.914714 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.914862 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:26 crc kubenswrapper[4743]: E0310 15:08:26.914967 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:26 crc kubenswrapper[4743]: E0310 15:08:26.915083 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:26 crc kubenswrapper[4743]: I0310 15:08:26.915476 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:26 crc kubenswrapper[4743]: E0310 15:08:26.915583 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:27 crc kubenswrapper[4743]: I0310 15:08:27.915621 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:27 crc kubenswrapper[4743]: E0310 15:08:27.916084 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:27 crc kubenswrapper[4743]: I0310 15:08:27.928485 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:27 crc kubenswrapper[4743]: E0310 15:08:27.928810 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:27 crc kubenswrapper[4743]: E0310 15:08:27.928941 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs podName:acbc8434-7aab-481b-ae0e-08696da082ad nodeName:}" failed. No retries permitted until 2026-03-10 15:09:31.928912952 +0000 UTC m=+236.635727730 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs") pod "network-metrics-daemon-vcq2w" (UID: "acbc8434-7aab-481b-ae0e-08696da082ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:28 crc kubenswrapper[4743]: I0310 15:08:28.915054 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:28 crc kubenswrapper[4743]: E0310 15:08:28.915457 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:28 crc kubenswrapper[4743]: I0310 15:08:28.915105 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:28 crc kubenswrapper[4743]: E0310 15:08:28.915713 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:28 crc kubenswrapper[4743]: I0310 15:08:28.915105 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:28 crc kubenswrapper[4743]: E0310 15:08:28.916202 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.045082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.045147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.045169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.045194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.045215 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:29Z","lastTransitionTime":"2026-03-10T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.104831 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp"] Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.105268 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.106883 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.107630 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.108429 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.108447 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.137212 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=81.137184385 podStartE2EDuration="1m21.137184385s" podCreationTimestamp="2026-03-10 15:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.124576283 +0000 UTC m=+173.831391061" watchObservedRunningTime="2026-03-10 15:08:29.137184385 +0000 UTC m=+173.843999153" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.152516 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nmxc7" podStartSLOduration=126.152491912 podStartE2EDuration="2m6.152491912s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.1376422 +0000 UTC m=+173.844456948" watchObservedRunningTime="2026-03-10 
15:08:29.152491912 +0000 UTC m=+173.859306650" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.185618 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.185600143 podStartE2EDuration="1m7.185600143s" podCreationTimestamp="2026-03-10 15:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.185585212 +0000 UTC m=+173.892399960" watchObservedRunningTime="2026-03-10 15:08:29.185600143 +0000 UTC m=+173.892414891" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.198372 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.19834937 podStartE2EDuration="44.19834937s" podCreationTimestamp="2026-03-10 15:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.198209765 +0000 UTC m=+173.905024513" watchObservedRunningTime="2026-03-10 15:08:29.19834937 +0000 UTC m=+173.905164128" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.243164 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e2abe04-508e-431c-878c-dfd85b46f578-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.243219 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e2abe04-508e-431c-878c-dfd85b46f578-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") 
" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.243244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2abe04-508e-431c-878c-dfd85b46f578-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.243281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e2abe04-508e-431c-878c-dfd85b46f578-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.243324 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e2abe04-508e-431c-878c-dfd85b46f578-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.263502 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podStartSLOduration=126.263481368 podStartE2EDuration="2m6.263481368s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.263273401 +0000 UTC m=+173.970088149" watchObservedRunningTime="2026-03-10 
15:08:29.263481368 +0000 UTC m=+173.970296116" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.280440 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s46xz" podStartSLOduration=126.280424096 podStartE2EDuration="2m6.280424096s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.279965591 +0000 UTC m=+173.986780359" watchObservedRunningTime="2026-03-10 15:08:29.280424096 +0000 UTC m=+173.987238844" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.303449 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vgbfn" podStartSLOduration=126.303425812 podStartE2EDuration="2m6.303425812s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.293612816 +0000 UTC m=+174.000427564" watchObservedRunningTime="2026-03-10 15:08:29.303425812 +0000 UTC m=+174.010240560" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.314957 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t9mrg" podStartSLOduration=126.31493562 podStartE2EDuration="2m6.31493562s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.314910269 +0000 UTC m=+174.021725017" watchObservedRunningTime="2026-03-10 15:08:29.31493562 +0000 UTC m=+174.021750368" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.326756 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rrjqn" podStartSLOduration=126.326734307 
podStartE2EDuration="2m6.326734307s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.326086367 +0000 UTC m=+174.032901115" watchObservedRunningTime="2026-03-10 15:08:29.326734307 +0000 UTC m=+174.033549055" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.341655 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.341627651 podStartE2EDuration="1m6.341627651s" podCreationTimestamp="2026-03-10 15:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.340794315 +0000 UTC m=+174.047609103" watchObservedRunningTime="2026-03-10 15:08:29.341627651 +0000 UTC m=+174.048442409" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.343880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e2abe04-508e-431c-878c-dfd85b46f578-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.343935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e2abe04-508e-431c-878c-dfd85b46f578-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.343962 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8e2abe04-508e-431c-878c-dfd85b46f578-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.343999 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e2abe04-508e-431c-878c-dfd85b46f578-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.344044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e2abe04-508e-431c-878c-dfd85b46f578-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.344082 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e2abe04-508e-431c-878c-dfd85b46f578-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.344238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e2abe04-508e-431c-878c-dfd85b46f578-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 
15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.345129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e2abe04-508e-431c-878c-dfd85b46f578-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.350572 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2abe04-508e-431c-878c-dfd85b46f578-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.360795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e2abe04-508e-431c-878c-dfd85b46f578-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kqswp\" (UID: \"8e2abe04-508e-431c-878c-dfd85b46f578\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.375584 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=100.375552388 podStartE2EDuration="1m40.375552388s" podCreationTimestamp="2026-03-10 15:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:29.357511166 +0000 UTC m=+174.064325934" watchObservedRunningTime="2026-03-10 15:08:29.375552388 +0000 UTC m=+174.082367136" Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.392081 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 15:08:29 crc 
kubenswrapper[4743]: I0310 15:08:29.399456 4743 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.420024 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" Mar 10 15:08:29 crc kubenswrapper[4743]: W0310 15:08:29.435140 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2abe04_508e_431c_878c_dfd85b46f578.slice/crio-63fb3fa37cbbad9b315227a503b11c789a5b9dd0497cc9862adb0db4cb154951 WatchSource:0}: Error finding container 63fb3fa37cbbad9b315227a503b11c789a5b9dd0497cc9862adb0db4cb154951: Status 404 returned error can't find the container with id 63fb3fa37cbbad9b315227a503b11c789a5b9dd0497cc9862adb0db4cb154951 Mar 10 15:08:29 crc kubenswrapper[4743]: I0310 15:08:29.915263 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:29 crc kubenswrapper[4743]: E0310 15:08:29.915679 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:30 crc kubenswrapper[4743]: I0310 15:08:30.189416 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" event={"ID":"8e2abe04-508e-431c-878c-dfd85b46f578","Type":"ContainerStarted","Data":"7e2c6a586fd3af3b68396f84d80447abe8d2baca0d832c0bd2a40cef68ba66d3"} Mar 10 15:08:30 crc kubenswrapper[4743]: I0310 15:08:30.189514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" event={"ID":"8e2abe04-508e-431c-878c-dfd85b46f578","Type":"ContainerStarted","Data":"63fb3fa37cbbad9b315227a503b11c789a5b9dd0497cc9862adb0db4cb154951"} Mar 10 15:08:30 crc kubenswrapper[4743]: I0310 15:08:30.207523 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kqswp" podStartSLOduration=127.207502472 podStartE2EDuration="2m7.207502472s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:30.206958325 +0000 UTC m=+174.913773083" watchObservedRunningTime="2026-03-10 15:08:30.207502472 +0000 UTC m=+174.914317220" Mar 10 15:08:30 crc kubenswrapper[4743]: I0310 15:08:30.914656 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:30 crc kubenswrapper[4743]: I0310 15:08:30.914656 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:30 crc kubenswrapper[4743]: E0310 15:08:30.914850 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:30 crc kubenswrapper[4743]: I0310 15:08:30.914676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:30 crc kubenswrapper[4743]: E0310 15:08:30.914930 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:30 crc kubenswrapper[4743]: E0310 15:08:30.915016 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:31 crc kubenswrapper[4743]: E0310 15:08:31.373383 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:31 crc kubenswrapper[4743]: I0310 15:08:31.915411 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:31 crc kubenswrapper[4743]: E0310 15:08:31.915634 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:32 crc kubenswrapper[4743]: I0310 15:08:32.914538 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:32 crc kubenswrapper[4743]: I0310 15:08:32.914690 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:32 crc kubenswrapper[4743]: I0310 15:08:32.914767 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:32 crc kubenswrapper[4743]: E0310 15:08:32.914792 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:32 crc kubenswrapper[4743]: E0310 15:08:32.914972 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:32 crc kubenswrapper[4743]: E0310 15:08:32.915068 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:33 crc kubenswrapper[4743]: I0310 15:08:33.915649 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:33 crc kubenswrapper[4743]: E0310 15:08:33.916173 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:34 crc kubenswrapper[4743]: I0310 15:08:34.914889 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:34 crc kubenswrapper[4743]: I0310 15:08:34.915046 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:34 crc kubenswrapper[4743]: I0310 15:08:34.915117 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:34 crc kubenswrapper[4743]: E0310 15:08:34.916040 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:34 crc kubenswrapper[4743]: E0310 15:08:34.916189 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:34 crc kubenswrapper[4743]: I0310 15:08:34.916342 4743 scope.go:117] "RemoveContainer" containerID="7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d" Mar 10 15:08:34 crc kubenswrapper[4743]: E0310 15:08:34.916391 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:34 crc kubenswrapper[4743]: E0310 15:08:34.918207 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:08:35 crc kubenswrapper[4743]: I0310 15:08:35.916115 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:35 crc kubenswrapper[4743]: E0310 15:08:35.917476 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:36 crc kubenswrapper[4743]: E0310 15:08:36.374887 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:36 crc kubenswrapper[4743]: I0310 15:08:36.919596 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:36 crc kubenswrapper[4743]: I0310 15:08:36.919701 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:36 crc kubenswrapper[4743]: I0310 15:08:36.919660 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:36 crc kubenswrapper[4743]: E0310 15:08:36.920168 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:36 crc kubenswrapper[4743]: E0310 15:08:36.920254 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:36 crc kubenswrapper[4743]: E0310 15:08:36.920412 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:37 crc kubenswrapper[4743]: I0310 15:08:37.914947 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:37 crc kubenswrapper[4743]: E0310 15:08:37.915288 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:38 crc kubenswrapper[4743]: I0310 15:08:38.915376 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:38 crc kubenswrapper[4743]: I0310 15:08:38.915542 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:38 crc kubenswrapper[4743]: I0310 15:08:38.915451 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:38 crc kubenswrapper[4743]: E0310 15:08:38.915705 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:38 crc kubenswrapper[4743]: E0310 15:08:38.915902 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:38 crc kubenswrapper[4743]: E0310 15:08:38.916168 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:39 crc kubenswrapper[4743]: I0310 15:08:39.914946 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:39 crc kubenswrapper[4743]: E0310 15:08:39.915162 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:40 crc kubenswrapper[4743]: I0310 15:08:40.915254 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:40 crc kubenswrapper[4743]: I0310 15:08:40.915354 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:40 crc kubenswrapper[4743]: I0310 15:08:40.915470 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:40 crc kubenswrapper[4743]: E0310 15:08:40.915678 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:40 crc kubenswrapper[4743]: E0310 15:08:40.915804 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:40 crc kubenswrapper[4743]: E0310 15:08:40.915977 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:41 crc kubenswrapper[4743]: E0310 15:08:41.376000 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:41 crc kubenswrapper[4743]: I0310 15:08:41.914952 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:41 crc kubenswrapper[4743]: E0310 15:08:41.915211 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:42 crc kubenswrapper[4743]: I0310 15:08:42.914414 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:42 crc kubenswrapper[4743]: I0310 15:08:42.914455 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:42 crc kubenswrapper[4743]: I0310 15:08:42.914801 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:42 crc kubenswrapper[4743]: E0310 15:08:42.914980 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:42 crc kubenswrapper[4743]: E0310 15:08:42.915135 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:42 crc kubenswrapper[4743]: E0310 15:08:42.915349 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:43 crc kubenswrapper[4743]: I0310 15:08:43.914487 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:43 crc kubenswrapper[4743]: E0310 15:08:43.914716 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:44 crc kubenswrapper[4743]: I0310 15:08:44.915265 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:44 crc kubenswrapper[4743]: I0310 15:08:44.915371 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:44 crc kubenswrapper[4743]: I0310 15:08:44.915378 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:44 crc kubenswrapper[4743]: E0310 15:08:44.915546 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:44 crc kubenswrapper[4743]: E0310 15:08:44.915647 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:44 crc kubenswrapper[4743]: E0310 15:08:44.915882 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:45 crc kubenswrapper[4743]: I0310 15:08:45.914838 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:45 crc kubenswrapper[4743]: E0310 15:08:45.917021 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:46 crc kubenswrapper[4743]: E0310 15:08:46.377446 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:46 crc kubenswrapper[4743]: I0310 15:08:46.914743 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:46 crc kubenswrapper[4743]: I0310 15:08:46.914934 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:46 crc kubenswrapper[4743]: I0310 15:08:46.914763 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:46 crc kubenswrapper[4743]: E0310 15:08:46.915056 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:46 crc kubenswrapper[4743]: E0310 15:08:46.915376 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:46 crc kubenswrapper[4743]: E0310 15:08:46.915534 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:47 crc kubenswrapper[4743]: I0310 15:08:47.914991 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:47 crc kubenswrapper[4743]: E0310 15:08:47.915907 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:47 crc kubenswrapper[4743]: I0310 15:08:47.916367 4743 scope.go:117] "RemoveContainer" containerID="7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d" Mar 10 15:08:47 crc kubenswrapper[4743]: E0310 15:08:47.916672 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dxdms_openshift-ovn-kubernetes(91ad6254-92fa-4092-8b86-2393f317f163)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" Mar 10 15:08:48 crc kubenswrapper[4743]: I0310 15:08:48.914808 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:48 crc kubenswrapper[4743]: I0310 15:08:48.914919 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:48 crc kubenswrapper[4743]: I0310 15:08:48.914808 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:48 crc kubenswrapper[4743]: E0310 15:08:48.915202 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:48 crc kubenswrapper[4743]: E0310 15:08:48.915217 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:48 crc kubenswrapper[4743]: E0310 15:08:48.915442 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:49 crc kubenswrapper[4743]: I0310 15:08:49.915288 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:49 crc kubenswrapper[4743]: E0310 15:08:49.915624 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:50 crc kubenswrapper[4743]: I0310 15:08:50.914379 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:50 crc kubenswrapper[4743]: I0310 15:08:50.914443 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:50 crc kubenswrapper[4743]: I0310 15:08:50.914446 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:50 crc kubenswrapper[4743]: E0310 15:08:50.914579 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:50 crc kubenswrapper[4743]: E0310 15:08:50.914937 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:50 crc kubenswrapper[4743]: E0310 15:08:50.915023 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:51 crc kubenswrapper[4743]: E0310 15:08:51.379159 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:51 crc kubenswrapper[4743]: I0310 15:08:51.914671 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:51 crc kubenswrapper[4743]: E0310 15:08:51.915154 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:52 crc kubenswrapper[4743]: I0310 15:08:52.915036 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:52 crc kubenswrapper[4743]: I0310 15:08:52.915160 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:52 crc kubenswrapper[4743]: E0310 15:08:52.915595 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:52 crc kubenswrapper[4743]: E0310 15:08:52.915716 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:52 crc kubenswrapper[4743]: I0310 15:08:52.915198 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:52 crc kubenswrapper[4743]: E0310 15:08:52.915890 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:53 crc kubenswrapper[4743]: I0310 15:08:53.914910 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:53 crc kubenswrapper[4743]: E0310 15:08:53.915188 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:54 crc kubenswrapper[4743]: I0310 15:08:54.914455 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:54 crc kubenswrapper[4743]: I0310 15:08:54.914571 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:54 crc kubenswrapper[4743]: E0310 15:08:54.914692 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:54 crc kubenswrapper[4743]: E0310 15:08:54.914877 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:54 crc kubenswrapper[4743]: I0310 15:08:54.915129 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:54 crc kubenswrapper[4743]: E0310 15:08:54.915295 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:55 crc kubenswrapper[4743]: I0310 15:08:55.915482 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:55 crc kubenswrapper[4743]: E0310 15:08:55.917386 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.341913 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/1.log" Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.342590 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/0.log" Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.342650 4743 generic.go:334] "Generic (PLEG): container finished" podID="1736aae6-d840-4b31-8c44-6637a05f37ef" containerID="88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe" exitCode=1 Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.342695 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgbfn" event={"ID":"1736aae6-d840-4b31-8c44-6637a05f37ef","Type":"ContainerDied","Data":"88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe"} Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.342743 4743 scope.go:117] "RemoveContainer" containerID="403b36fa63f3444255df74ffe629ae6a1a9337fa5f574954fcea40fcc2a6733e" Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.343505 4743 scope.go:117] "RemoveContainer" containerID="88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe" Mar 10 15:08:56 crc kubenswrapper[4743]: E0310 15:08:56.343785 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-vgbfn_openshift-multus(1736aae6-d840-4b31-8c44-6637a05f37ef)\"" pod="openshift-multus/multus-vgbfn" podUID="1736aae6-d840-4b31-8c44-6637a05f37ef" Mar 10 15:08:56 crc kubenswrapper[4743]: E0310 15:08:56.382596 4743 kubelet.go:2916] "Container runtime network not ready" 
networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.914633 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.914680 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:56 crc kubenswrapper[4743]: I0310 15:08:56.914654 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:56 crc kubenswrapper[4743]: E0310 15:08:56.914866 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:56 crc kubenswrapper[4743]: E0310 15:08:56.915043 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:56 crc kubenswrapper[4743]: E0310 15:08:56.915345 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:57 crc kubenswrapper[4743]: I0310 15:08:57.199140 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:08:57 crc kubenswrapper[4743]: I0310 15:08:57.199345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.199397 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:10:59.199353688 +0000 UTC m=+323.906168446 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.199530 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:08:57 crc kubenswrapper[4743]: I0310 15:08:57.199530 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.199630 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:10:59.199598915 +0000 UTC m=+323.906413833 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.199635 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.199932 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:10:59.199863833 +0000 UTC m=+323.906678631 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:08:57 crc kubenswrapper[4743]: I0310 15:08:57.301022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:57 crc kubenswrapper[4743]: I0310 15:08:57.301112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.301339 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.301369 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.301389 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.301402 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.301468 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.301474 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:10:59.301448515 +0000 UTC m=+324.008263293 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.301495 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.301593 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:10:59.301560108 +0000 UTC m=+324.008374896 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:08:57 crc kubenswrapper[4743]: I0310 15:08:57.348303 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/1.log" Mar 10 15:08:57 crc kubenswrapper[4743]: I0310 15:08:57.915451 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:57 crc kubenswrapper[4743]: E0310 15:08:57.915735 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:08:58 crc kubenswrapper[4743]: I0310 15:08:58.915219 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:58 crc kubenswrapper[4743]: I0310 15:08:58.915221 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:58 crc kubenswrapper[4743]: E0310 15:08:58.915446 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:58 crc kubenswrapper[4743]: I0310 15:08:58.915509 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:58 crc kubenswrapper[4743]: E0310 15:08:58.915665 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:58 crc kubenswrapper[4743]: E0310 15:08:58.915920 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:59 crc kubenswrapper[4743]: I0310 15:08:59.914500 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:08:59 crc kubenswrapper[4743]: E0310 15:08:59.914872 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:09:00 crc kubenswrapper[4743]: I0310 15:09:00.914734 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:00 crc kubenswrapper[4743]: I0310 15:09:00.914793 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:00 crc kubenswrapper[4743]: I0310 15:09:00.914877 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:00 crc kubenswrapper[4743]: E0310 15:09:00.914990 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:00 crc kubenswrapper[4743]: E0310 15:09:00.915338 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:00 crc kubenswrapper[4743]: E0310 15:09:00.915533 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:00 crc kubenswrapper[4743]: I0310 15:09:00.915662 4743 scope.go:117] "RemoveContainer" containerID="7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d" Mar 10 15:09:01 crc kubenswrapper[4743]: I0310 15:09:01.365903 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/3.log" Mar 10 15:09:01 crc kubenswrapper[4743]: I0310 15:09:01.369095 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerStarted","Data":"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"} Mar 10 15:09:01 crc kubenswrapper[4743]: E0310 15:09:01.383456 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:09:01 crc kubenswrapper[4743]: I0310 15:09:01.401284 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podStartSLOduration=158.401260559 podStartE2EDuration="2m38.401260559s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:01.399321742 +0000 UTC m=+206.106136490" watchObservedRunningTime="2026-03-10 15:09:01.401260559 +0000 UTC m=+206.108075307" Mar 10 15:09:01 crc kubenswrapper[4743]: I0310 15:09:01.789084 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vcq2w"] Mar 10 15:09:01 crc kubenswrapper[4743]: I0310 15:09:01.789259 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:01 crc kubenswrapper[4743]: E0310 15:09:01.789402 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:09:02 crc kubenswrapper[4743]: I0310 15:09:02.915116 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:02 crc kubenswrapper[4743]: I0310 15:09:02.915212 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:02 crc kubenswrapper[4743]: I0310 15:09:02.915231 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:02 crc kubenswrapper[4743]: E0310 15:09:02.915319 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:02 crc kubenswrapper[4743]: E0310 15:09:02.915477 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:02 crc kubenswrapper[4743]: E0310 15:09:02.915604 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:03 crc kubenswrapper[4743]: I0310 15:09:03.914455 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:03 crc kubenswrapper[4743]: E0310 15:09:03.914745 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:09:04 crc kubenswrapper[4743]: I0310 15:09:04.915350 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:04 crc kubenswrapper[4743]: I0310 15:09:04.915471 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:04 crc kubenswrapper[4743]: E0310 15:09:04.915569 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:04 crc kubenswrapper[4743]: I0310 15:09:04.915495 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:04 crc kubenswrapper[4743]: E0310 15:09:04.915765 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:04 crc kubenswrapper[4743]: E0310 15:09:04.915935 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:05 crc kubenswrapper[4743]: I0310 15:09:05.914685 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:05 crc kubenswrapper[4743]: E0310 15:09:05.916275 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:09:06 crc kubenswrapper[4743]: E0310 15:09:06.384598 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:06 crc kubenswrapper[4743]: I0310 15:09:06.915042 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:06 crc kubenswrapper[4743]: I0310 15:09:06.915127 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:06 crc kubenswrapper[4743]: E0310 15:09:06.915240 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:06 crc kubenswrapper[4743]: I0310 15:09:06.915319 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:06 crc kubenswrapper[4743]: E0310 15:09:06.915341 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:06 crc kubenswrapper[4743]: E0310 15:09:06.915505 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:07 crc kubenswrapper[4743]: I0310 15:09:07.914942 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:07 crc kubenswrapper[4743]: E0310 15:09:07.915113 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:09:08 crc kubenswrapper[4743]: I0310 15:09:08.914948 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:08 crc kubenswrapper[4743]: I0310 15:09:08.914997 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:08 crc kubenswrapper[4743]: I0310 15:09:08.914948 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:08 crc kubenswrapper[4743]: E0310 15:09:08.915206 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:08 crc kubenswrapper[4743]: I0310 15:09:08.915409 4743 scope.go:117] "RemoveContainer" containerID="88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe" Mar 10 15:09:08 crc kubenswrapper[4743]: E0310 15:09:08.915493 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:08 crc kubenswrapper[4743]: E0310 15:09:08.915768 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:09 crc kubenswrapper[4743]: I0310 15:09:09.399572 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/1.log" Mar 10 15:09:09 crc kubenswrapper[4743]: I0310 15:09:09.400092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgbfn" event={"ID":"1736aae6-d840-4b31-8c44-6637a05f37ef","Type":"ContainerStarted","Data":"c1aa041b4cfddd3a861e32679ddaa7411c71c86111d77caf2db634f7bcfbc8bb"} Mar 10 15:09:09 crc kubenswrapper[4743]: I0310 15:09:09.915183 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:09 crc kubenswrapper[4743]: E0310 15:09:09.915396 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcq2w" podUID="acbc8434-7aab-481b-ae0e-08696da082ad" Mar 10 15:09:10 crc kubenswrapper[4743]: I0310 15:09:10.915279 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:10 crc kubenswrapper[4743]: I0310 15:09:10.915344 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:10 crc kubenswrapper[4743]: I0310 15:09:10.915982 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:10 crc kubenswrapper[4743]: E0310 15:09:10.916120 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:10 crc kubenswrapper[4743]: E0310 15:09:10.916341 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:10 crc kubenswrapper[4743]: E0310 15:09:10.916625 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:11 crc kubenswrapper[4743]: I0310 15:09:11.602042 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:09:11 crc kubenswrapper[4743]: I0310 15:09:11.625526 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:09:11 crc kubenswrapper[4743]: I0310 15:09:11.914874 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:11 crc kubenswrapper[4743]: I0310 15:09:11.917087 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 15:09:11 crc kubenswrapper[4743]: I0310 15:09:11.917779 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 15:09:12 crc kubenswrapper[4743]: I0310 15:09:12.915378 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:12 crc kubenswrapper[4743]: I0310 15:09:12.915454 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:12 crc kubenswrapper[4743]: I0310 15:09:12.915382 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:12 crc kubenswrapper[4743]: I0310 15:09:12.919067 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 15:09:12 crc kubenswrapper[4743]: I0310 15:09:12.919079 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 15:09:12 crc kubenswrapper[4743]: I0310 15:09:12.919161 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 15:09:12 crc kubenswrapper[4743]: I0310 15:09:12.920078 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.218223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.265829 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-59rpz"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.266394 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-59rpz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.268108 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.268983 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.270225 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.271162 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.272199 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6gjk2"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.273174 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.285084 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.285225 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.285432 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.285710 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.285781 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.285952 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.286155 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.286259 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mlm2p"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.286388 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.286796 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.286931 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.286800 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.288569 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.288760 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.288784 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.288917 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.288990 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.289096 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.289158 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.289236 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.290349 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.290956 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.291435 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.292079 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.292322 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.292504 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.296342 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-v9bc6"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.297322 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2lf6"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.298012 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-97jtg"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.298708 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.298957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.298029 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.299928 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.298072 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.300541 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-658rk"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.306260 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cdt8j"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.307115 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.307933 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.310178 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.310091 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.311124 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.311620 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.324416 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-65l5t"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.325358 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4rxv"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.326236 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.327763 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.328163 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.331431 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.331853 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.310033 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.333276 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.345187 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.346359 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.346696 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347034 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347305 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347390 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347444 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347471 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347575 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 
15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347588 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347677 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347739 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347767 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347888 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347909 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347934 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347404 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348029 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.347694 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348087 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348250 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348314 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348421 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348490 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348533 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348679 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348725 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348772 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348866 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348879 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348957 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349010 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349031 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349110 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349126 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349187 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349233 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348260 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349386 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349497 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349506 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 15:09:20 crc 
kubenswrapper[4743]: I0310 15:09:20.349627 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.348680 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349722 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.349951 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350003 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350061 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350128 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350146 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350196 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350235 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350300 4743 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350326 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350376 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350061 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.350479 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.353886 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.354747 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.355142 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vfhfq"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.355734 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.355796 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.358650 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.362073 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.363304 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.364676 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.365624 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.365867 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.366026 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.366063 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.366180 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.366289 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 
15:09:20.366654 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.366926 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.367341 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.367414 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.367462 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.367471 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.367488 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.367594 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.367609 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.367907 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.368336 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.368905 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.375036 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mmmfs"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.376141 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.376266 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2d45q"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.376964 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.392637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7plq\" (UniqueName: \"kubernetes.io/projected/5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932-kube-api-access-s7plq\") pod \"dns-operator-744455d44c-w4rxv\" (UID: \"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.402564 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404330 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25bb44ff-d318-434a-82a7-0605d1fb57f2-serving-cert\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-config\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-audit\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404434 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-encryption-config\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gp2n\" (UniqueName: \"kubernetes.io/projected/d02c0611-45a7-4760-8187-4fd2b39f7dd4-kube-api-access-7gp2n\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-trusted-ca-bundle\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-config\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9572\" (UniqueName: \"kubernetes.io/projected/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-kube-api-access-g9572\") pod \"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404572 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4hx9\" (UniqueName: \"kubernetes.io/projected/320568c9-bd4b-4ebd-a575-650bcdd5d104-kube-api-access-k4hx9\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-oauth-serving-cert\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404625 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25bb44ff-d318-434a-82a7-0605d1fb57f2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-config\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404687 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqxtj\" (UniqueName: \"kubernetes.io/projected/f1af669d-9f68-40aa-8789-3b8166784d40-kube-api-access-tqxtj\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-config\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-client\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404790 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404843 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/16986a56-efd6-49bb-9953-f6cf8d5b5e3d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wrn4x\" (UID: \"16986a56-efd6-49bb-9953-f6cf8d5b5e3d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404905 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404938 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46sl\" (UniqueName: \"kubernetes.io/projected/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-kube-api-access-f46sl\") pod \"console-operator-58897d9998-65l5t\" (UID: 
\"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404965 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.404995 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.405019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-serving-cert\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.405043 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-service-ca\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.405070 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-etcd-serving-ca\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.405108 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42945\" (UniqueName: \"kubernetes.io/projected/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-kube-api-access-42945\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.405135 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdl7f\" (UniqueName: \"kubernetes.io/projected/5a2b935b-dc0d-4fec-9869-2a124ce4c274-kube-api-access-jdl7f\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.405236 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.405983 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-trusted-ca\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.434087 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 
15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.434799 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.437287 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.438417 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.438859 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nsqd8"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.439334 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.439709 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.439891 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.440590 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.441630 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442177 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-image-import-ca\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-oauth-config\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 
15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/538f8801-691a-4332-a1c7-ae0d1b570198-signing-key\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-dir\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442532 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-serving-cert\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442691 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-policies\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442880 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2wvr\" (UniqueName: \"kubernetes.io/projected/52534b9a-5f54-4def-baf0-e5755c7e98d2-kube-api-access-c2wvr\") pod \"etcd-operator-b45778765-97jtg\" (UID: 
\"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.442932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c0611-45a7-4760-8187-4fd2b39f7dd4-config\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443013 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443084 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4410bf70-11af-4388-9733-4d099fd8fff5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: \"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443218 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtdzt\" (UniqueName: \"kubernetes.io/projected/759dea92-ff95-4a61-8fa3-bb23f6306128-kube-api-access-dtdzt\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443282 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-etcd-client\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443332 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a2b935b-dc0d-4fec-9869-2a124ce4c274-audit-dir\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443355 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443399 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rndjn\" (UniqueName: \"kubernetes.io/projected/9164077f-fddc-43e6-9aac-23a8be818d9f-kube-api-access-rndjn\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: 
\"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443420 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc5x8\" (UniqueName: \"kubernetes.io/projected/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-kube-api-access-kc5x8\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932-metrics-tls\") pod \"dns-operator-744455d44c-w4rxv\" (UID: \"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443465 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9mtg\" (UniqueName: \"kubernetes.io/projected/2aed27d0-7067-4b00-bd27-07e71dbb0ff6-kube-api-access-w9mtg\") pod \"package-server-manager-789f6589d5-76lwx\" (UID: \"2aed27d0-7067-4b00-bd27-07e71dbb0ff6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443506 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtgq7\" (UniqueName: \"kubernetes.io/projected/ec0a0850-2f3c-4a27-a08c-0820a360ace9-kube-api-access-dtgq7\") pod \"downloads-7954f5f757-59rpz\" (UID: \"ec0a0850-2f3c-4a27-a08c-0820a360ace9\") " pod="openshift-console/downloads-7954f5f757-59rpz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443583 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbzl\" (UniqueName: \"kubernetes.io/projected/538f8801-691a-4332-a1c7-ae0d1b570198-kube-api-access-gsbzl\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443606 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443633 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-config\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959nr\" (UniqueName: \"kubernetes.io/projected/4410bf70-11af-4388-9733-4d099fd8fff5-kube-api-access-959nr\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: \"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443785 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d02c0611-45a7-4760-8187-4fd2b39f7dd4-images\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443838 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-service-ca-bundle\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443860 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/759dea92-ff95-4a61-8fa3-bb23f6306128-auth-proxy-config\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.443878 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-ca\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444415 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-config\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-audit-policies\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444668 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444688 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-encryption-config\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444716 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aed27d0-7067-4b00-bd27-07e71dbb0ff6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-76lwx\" (UID: \"2aed27d0-7067-4b00-bd27-07e71dbb0ff6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-client-ca\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.444982 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/759dea92-ff95-4a61-8fa3-bb23f6306128-machine-approver-tls\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1af669d-9f68-40aa-8789-3b8166784d40-serving-cert\") pod 
\"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445037 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dljq\" (UniqueName: \"kubernetes.io/projected/16986a56-efd6-49bb-9953-f6cf8d5b5e3d-kube-api-access-4dljq\") pod \"cluster-samples-operator-665b6dd947-wrn4x\" (UID: \"16986a56-efd6-49bb-9953-f6cf8d5b5e3d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445053 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-config\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-serving-cert\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 
crc kubenswrapper[4743]: I0310 15:09:20.445089 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28rsb\" (UniqueName: \"kubernetes.io/projected/25bb44ff-d318-434a-82a7-0605d1fb57f2-kube-api-access-28rsb\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/538f8801-691a-4332-a1c7-ae0d1b570198-signing-cabundle\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445128 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-serving-cert\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/759dea92-ff95-4a61-8fa3-bb23f6306128-config\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/320568c9-bd4b-4ebd-a575-650bcdd5d104-node-pullsecrets\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-serving-cert\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-client-ca\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445224 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkczh\" (UniqueName: \"kubernetes.io/projected/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-kube-api-access-zkczh\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52534b9a-5f54-4def-baf0-e5755c7e98d2-serving-cert\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: 
I0310 15:09:20.445259 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-trusted-ca-bundle\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445282 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chd4m\" (UniqueName: \"kubernetes.io/projected/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-kube-api-access-chd4m\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d02c0611-45a7-4760-8187-4fd2b39f7dd4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/320568c9-bd4b-4ebd-a575-650bcdd5d104-audit-dir\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445335 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4410bf70-11af-4388-9733-4d099fd8fff5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: 
\"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445354 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-service-ca\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445372 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-etcd-client\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.445388 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.448522 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.449628 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.451136 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 
15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.454655 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.454520 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.455496 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.455947 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.456117 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.458013 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.458707 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.460925 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.462656 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.465998 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552588-mtmtv"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.467181 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.469061 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.469161 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.469927 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.471277 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.471915 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.477055 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.477745 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l66d6"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.478067 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.478864 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.479195 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.480302 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mxlth"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.480397 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.480921 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.482790 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6gjk2"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.483847 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.486065 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.487274 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zpv9r"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.487874 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zpv9r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.489166 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.489722 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-97jtg"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.491402 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.493162 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.494067 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cdt8j"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.495290 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.497256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2d45q"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.498570 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mmmfs"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.499781 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.502843 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v9bc6"] Mar 10 15:09:20 crc 
kubenswrapper[4743]: I0310 15:09:20.504745 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4rxv"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.506598 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2lf6"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.510287 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.511455 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.516398 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mlm2p"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.518155 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-65l5t"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.519413 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-658rk"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.522301 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.526659 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.528673 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.529426 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.533201 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.536639 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.536726 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.542476 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.542565 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vfhfq"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.544206 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.545259 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546239 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-config\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959nr\" (UniqueName: \"kubernetes.io/projected/4410bf70-11af-4388-9733-4d099fd8fff5-kube-api-access-959nr\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: \"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt4fn\" (UniqueName: \"kubernetes.io/projected/746f0851-522d-4354-9be2-0d370e2af3a2-kube-api-access-jt4fn\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d02c0611-45a7-4760-8187-4fd2b39f7dd4-images\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546327 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-default-certificate\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " 
pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/759dea92-ff95-4a61-8fa3-bb23f6306128-auth-proxy-config\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546367 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-service-ca-bundle\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546429 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-audit-policies\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-config\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546937 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.546972 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57bc2216-e8f7-4056-afa8-1daa3daf04db-config\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aed27d0-7067-4b00-bd27-07e71dbb0ff6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-76lwx\" (UID: \"2aed27d0-7067-4b00-bd27-07e71dbb0ff6\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547031 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-encryption-config\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547058 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547083 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746f0851-522d-4354-9be2-0d370e2af3a2-metrics-tls\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-client-ca\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-stats-auth\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvbmd\" (UniqueName: \"kubernetes.io/projected/9cbc1d80-bce4-4e58-873d-6723e2020c66-kube-api-access-pvbmd\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1af669d-9f68-40aa-8789-3b8166784d40-serving-cert\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/538f8801-691a-4332-a1c7-ae0d1b570198-signing-cabundle\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-serving-cert\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547262 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-serving-cert\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc1d80-bce4-4e58-873d-6723e2020c66-config\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/759dea92-ff95-4a61-8fa3-bb23f6306128-config\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547345 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/afc92a5c-a4ef-4ae8-9425-9787ea43ca0a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr4wq\" (UID: \"afc92a5c-a4ef-4ae8-9425-9787ea43ca0a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkczh\" (UniqueName: \"kubernetes.io/projected/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-kube-api-access-zkczh\") pod 
\"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52534b9a-5f54-4def-baf0-e5755c7e98d2-serving-cert\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547428 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-trusted-ca-bundle\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547455 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chd4m\" (UniqueName: \"kubernetes.io/projected/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-kube-api-access-chd4m\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-etcd-client\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547510 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547871 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l66d6"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.547536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7plq\" (UniqueName: \"kubernetes.io/projected/5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932-kube-api-access-s7plq\") pod \"dns-operator-744455d44c-w4rxv\" (UID: \"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548132 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19999585-e753-49c3-b70d-c8b5a406a823-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548141 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-audit-policies\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548173 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-audit\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-trusted-ca-bundle\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548270 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-config\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548304 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-metrics-certs\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548332 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25bb44ff-d318-434a-82a7-0605d1fb57f2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19999585-e753-49c3-b70d-c8b5a406a823-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgfrk\" (UniqueName: \"kubernetes.io/projected/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-kube-api-access-kgfrk\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548470 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/746f0851-522d-4354-9be2-0d370e2af3a2-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-client\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548545 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548575 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62xxd\" (UniqueName: \"kubernetes.io/projected/19999585-e753-49c3-b70d-c8b5a406a823-kube-api-access-62xxd\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548603 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-service-ca-bundle\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548633 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f46sl\" (UniqueName: \"kubernetes.io/projected/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-kube-api-access-f46sl\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548662 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-service-ca\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548675 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42945\" (UniqueName: \"kubernetes.io/projected/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-kube-api-access-42945\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548733 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdl7f\" (UniqueName: \"kubernetes.io/projected/5a2b935b-dc0d-4fec-9869-2a124ce4c274-kube-api-access-jdl7f\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548791 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-trusted-ca\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548849 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548879 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-image-import-ca\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548905 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbc1d80-bce4-4e58-873d-6723e2020c66-serving-cert\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548928 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/746f0851-522d-4354-9be2-0d370e2af3a2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-oauth-config\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548976 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-policies\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.548999 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2wvr\" (UniqueName: \"kubernetes.io/projected/52534b9a-5f54-4def-baf0-e5755c7e98d2-kube-api-access-c2wvr\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.549020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.549019 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/759dea92-ff95-4a61-8fa3-bb23f6306128-auth-proxy-config\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.549082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4410bf70-11af-4388-9733-4d099fd8fff5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: \"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.549504 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.549869 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-service-ca-bundle\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.550258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d02c0611-45a7-4760-8187-4fd2b39f7dd4-images\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc 
kubenswrapper[4743]: I0310 15:09:20.550278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.550399 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-config\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.550727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1af669d-9f68-40aa-8789-3b8166784d40-config\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.551401 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-trusted-ca-bundle\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.551916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-service-ca\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc 
kubenswrapper[4743]: I0310 15:09:20.552158 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25bb44ff-d318-434a-82a7-0605d1fb57f2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.552791 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-config\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.553841 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-client-ca\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.553912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.554041 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mxlth"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.554082 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.554097 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wxcl8"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.555509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52534b9a-5f54-4def-baf0-e5755c7e98d2-serving-cert\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.555838 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5ps5r"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.556845 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-trusted-ca\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557028 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7wl5\" (UniqueName: \"kubernetes.io/projected/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-kube-api-access-f7wl5\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtdzt\" (UniqueName: \"kubernetes.io/projected/759dea92-ff95-4a61-8fa3-bb23f6306128-kube-api-access-dtdzt\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557104 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rndjn\" (UniqueName: \"kubernetes.io/projected/9164077f-fddc-43e6-9aac-23a8be818d9f-kube-api-access-rndjn\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557107 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-oauth-config\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc5x8\" (UniqueName: \"kubernetes.io/projected/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-kube-api-access-kc5x8\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557167 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-srv-cert\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557193 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-apiservice-cert\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtgq7\" (UniqueName: \"kubernetes.io/projected/ec0a0850-2f3c-4a27-a08c-0820a360ace9-kube-api-access-dtgq7\") pod \"downloads-7954f5f757-59rpz\" (UID: \"ec0a0850-2f3c-4a27-a08c-0820a360ace9\") " pod="openshift-console/downloads-7954f5f757-59rpz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557282 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xswl\" (UniqueName: \"kubernetes.io/projected/afc92a5c-a4ef-4ae8-9425-9787ea43ca0a-kube-api-access-4xswl\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr4wq\" (UID: \"afc92a5c-a4ef-4ae8-9425-9787ea43ca0a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557302 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbzl\" (UniqueName: 
\"kubernetes.io/projected/538f8801-691a-4332-a1c7-ae0d1b570198-kube-api-access-gsbzl\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557474 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.558753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.558790 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-audit\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.558856 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-mtmtv"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.557429 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.560275 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.560376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.560441 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/759dea92-ff95-4a61-8fa3-bb23f6306128-config\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.560561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-ca\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.560608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrh6v\" (UniqueName: 
\"kubernetes.io/projected/a65120a3-0d66-481a-9f09-9f338a85cbc4-kube-api-access-mrh6v\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.560855 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7t76\" (UniqueName: \"kubernetes.io/projected/92680623-686c-4b48-9fcd-b5e9d558a758-kube-api-access-b7t76\") pod \"multus-admission-controller-857f4d67dd-mmmfs\" (UID: \"92680623-686c-4b48-9fcd-b5e9d558a758\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.560976 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/759dea92-ff95-4a61-8fa3-bb23f6306128-machine-approver-tls\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57bc2216-e8f7-4056-afa8-1daa3daf04db-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561115 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-webhook-cert\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561203 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dljq\" (UniqueName: \"kubernetes.io/projected/16986a56-efd6-49bb-9953-f6cf8d5b5e3d-kube-api-access-4dljq\") pod \"cluster-samples-operator-665b6dd947-wrn4x\" (UID: \"16986a56-efd6-49bb-9953-f6cf8d5b5e3d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-config\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28rsb\" (UniqueName: \"kubernetes.io/projected/25bb44ff-d318-434a-82a7-0605d1fb57f2-kube-api-access-28rsb\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57bc2216-e8f7-4056-afa8-1daa3daf04db-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-ca\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/320568c9-bd4b-4ebd-a575-650bcdd5d104-node-pullsecrets\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-serving-cert\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561384 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f56xt\" (UniqueName: \"kubernetes.io/projected/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-kube-api-access-f56xt\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561430 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-client-ca\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d02c0611-45a7-4760-8187-4fd2b39f7dd4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-service-ca\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561490 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/320568c9-bd4b-4ebd-a575-650bcdd5d104-audit-dir\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561508 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4410bf70-11af-4388-9733-4d099fd8fff5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: \"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 
10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561529 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-images\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561537 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561603 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a65120a3-0d66-481a-9f09-9f338a85cbc4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561660 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25bb44ff-d318-434a-82a7-0605d1fb57f2-serving-cert\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc 
kubenswrapper[4743]: I0310 15:09:20.561688 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-encryption-config\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561707 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/320568c9-bd4b-4ebd-a575-650bcdd5d104-node-pullsecrets\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.561708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gp2n\" (UniqueName: \"kubernetes.io/projected/d02c0611-45a7-4760-8187-4fd2b39f7dd4-kube-api-access-7gp2n\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.562419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.562533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: 
\"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.562570 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9572\" (UniqueName: \"kubernetes.io/projected/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-kube-api-access-g9572\") pod \"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-service-ca\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.563499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4410bf70-11af-4388-9733-4d099fd8fff5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: \"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.563855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-config\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.563876 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/320568c9-bd4b-4ebd-a575-650bcdd5d104-audit-dir\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-client-ca\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-config\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4hx9\" (UniqueName: \"kubernetes.io/projected/320568c9-bd4b-4ebd-a575-650bcdd5d104-kube-api-access-k4hx9\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.562744 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zpv9r"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564639 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-oauth-serving-cert\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-config\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-config\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564794 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqxtj\" (UniqueName: \"kubernetes.io/projected/f1af669d-9f68-40aa-8789-3b8166784d40-kube-api-access-tqxtj\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564851 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlmfg\" (UniqueName: \"kubernetes.io/projected/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-kube-api-access-wlmfg\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564883 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm27d\" (UniqueName: \"kubernetes.io/projected/daf9e0f1-18ae-4197-b6b1-9a11439768b8-kube-api-access-hm27d\") pod \"migrator-59844c95c7-g9ssn\" (UID: \"daf9e0f1-18ae-4197-b6b1-9a11439768b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564918 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-proxy-tls\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564951 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.564984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/16986a56-efd6-49bb-9953-f6cf8d5b5e3d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wrn4x\" (UID: \"16986a56-efd6-49bb-9953-f6cf8d5b5e3d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: 
\"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565116 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-serving-cert\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-tmpfs\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-etcd-serving-ca\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565269 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a65120a3-0d66-481a-9f09-9f338a85cbc4-proxy-tls\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565309 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/538f8801-691a-4332-a1c7-ae0d1b570198-signing-key\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-dir\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565421 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c0611-45a7-4760-8187-4fd2b39f7dd4-config\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565460 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-serving-cert\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-oauth-serving-cert\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565551 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-etcd-client\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565577 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a2b935b-dc0d-4fec-9869-2a124ce4c274-audit-dir\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565606 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565614 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-config\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9mtg\" (UniqueName: \"kubernetes.io/projected/2aed27d0-7067-4b00-bd27-07e71dbb0ff6-kube-api-access-w9mtg\") pod \"package-server-manager-789f6589d5-76lwx\" (UID: \"2aed27d0-7067-4b00-bd27-07e71dbb0ff6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932-metrics-tls\") pod \"dns-operator-744455d44c-w4rxv\" (UID: \"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" Mar 10 
15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.565732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92680623-686c-4b48-9fcd-b5e9d558a758-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mmmfs\" (UID: \"92680623-686c-4b48-9fcd-b5e9d558a758\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.566929 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.567353 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-config\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.567354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.567431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2b935b-dc0d-4fec-9869-2a124ce4c274-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.567528 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-dir\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.567658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-etcd-serving-ca\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.567788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-etcd-client\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.567868 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wxcl8"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.567905 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-5ps5r"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.568011 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.568165 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/52534b9a-5f54-4def-baf0-e5755c7e98d2-etcd-client\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.568477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c0611-45a7-4760-8187-4fd2b39f7dd4-config\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.568538 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.568990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1af669d-9f68-40aa-8789-3b8166784d40-serving-cert\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.569847 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.570020 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-serving-cert\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.570285 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-serving-cert\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.570337 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a2b935b-dc0d-4fec-9869-2a124ce4c274-audit-dir\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.569858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4410bf70-11af-4388-9733-4d099fd8fff5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: \"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.570702 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d02c0611-45a7-4760-8187-4fd2b39f7dd4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.570919 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-serving-cert\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.571476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-etcd-client\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.571587 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-serving-cert\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.571764 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-59rpz"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.572253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-serving-cert\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.572576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.573074 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.573247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932-metrics-tls\") pod \"dns-operator-744455d44c-w4rxv\" (UID: \"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.573350 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25bb44ff-d318-434a-82a7-0605d1fb57f2-serving-cert\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.573673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.573899 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.574180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.574190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a2b935b-dc0d-4fec-9869-2a124ce4c274-encryption-config\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.574383 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5vnfn"] Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.574643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.575257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/759dea92-ff95-4a61-8fa3-bb23f6306128-machine-approver-tls\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.575493 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.575654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.576717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/320568c9-bd4b-4ebd-a575-650bcdd5d104-encryption-config\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.588285 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-policies\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: 
\"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.589130 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/16986a56-efd6-49bb-9953-f6cf8d5b5e3d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wrn4x\" (UID: \"16986a56-efd6-49bb-9953-f6cf8d5b5e3d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.589802 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.608844 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.617079 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-config\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.634348 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.643298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-trusted-ca-bundle\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.657837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/320568c9-bd4b-4ebd-a575-650bcdd5d104-image-import-ca\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.666732 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92680623-686c-4b48-9fcd-b5e9d558a758-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mmmfs\" (UID: \"92680623-686c-4b48-9fcd-b5e9d558a758\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.666798 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt4fn\" (UniqueName: \"kubernetes.io/projected/746f0851-522d-4354-9be2-0d370e2af3a2-kube-api-access-jt4fn\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.666844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-default-certificate\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.666881 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57bc2216-e8f7-4056-afa8-1daa3daf04db-config\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.666920 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746f0851-522d-4354-9be2-0d370e2af3a2-metrics-tls\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.666954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvbmd\" (UniqueName: \"kubernetes.io/projected/9cbc1d80-bce4-4e58-873d-6723e2020c66-kube-api-access-pvbmd\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.666981 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-stats-auth\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667012 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-config\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667030 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-socket-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " 
pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc1d80-bce4-4e58-873d-6723e2020c66-config\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667080 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/afc92a5c-a4ef-4ae8-9425-9787ea43ca0a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr4wq\" (UID: \"afc92a5c-a4ef-4ae8-9425-9787ea43ca0a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-registration-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667167 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/19999585-e753-49c3-b70d-c8b5a406a823-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667240 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-metrics-certs\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667295 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/746f0851-522d-4354-9be2-0d370e2af3a2-trusted-ca\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667361 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqfxn\" (UniqueName: \"kubernetes.io/projected/7837cec9-3686-497f-b9ec-2525768cd8ce-kube-api-access-lqfxn\") pod \"auto-csr-approver-29552588-mtmtv\" (UID: \"7837cec9-3686-497f-b9ec-2525768cd8ce\") " 
pod="openshift-infra/auto-csr-approver-29552588-mtmtv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19999585-e753-49c3-b70d-c8b5a406a823-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667413 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgfrk\" (UniqueName: \"kubernetes.io/projected/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-kube-api-access-kgfrk\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667439 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhbr\" (UniqueName: \"kubernetes.io/projected/01241a54-7e00-4d18-b407-4278620b7ac7-kube-api-access-dbhbr\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkzs8\" (UniqueName: \"kubernetes.io/projected/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-kube-api-access-kkzs8\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667508 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62xxd\" (UniqueName: 
\"kubernetes.io/projected/19999585-e753-49c3-b70d-c8b5a406a823-kube-api-access-62xxd\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667531 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-service-ca-bundle\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbc1d80-bce4-4e58-873d-6723e2020c66-serving-cert\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/746f0851-522d-4354-9be2-0d370e2af3a2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" 
Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667830 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-plugins-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667859 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7wl5\" (UniqueName: \"kubernetes.io/projected/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-kube-api-access-f7wl5\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-srv-cert\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-apiservice-cert\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-csi-data-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " 
pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.667994 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xswl\" (UniqueName: \"kubernetes.io/projected/afc92a5c-a4ef-4ae8-9425-9787ea43ca0a-kube-api-access-4xswl\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr4wq\" (UID: \"afc92a5c-a4ef-4ae8-9425-9787ea43ca0a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qd6\" (UniqueName: \"kubernetes.io/projected/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-kube-api-access-l7qd6\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrh6v\" (UniqueName: \"kubernetes.io/projected/a65120a3-0d66-481a-9f09-9f338a85cbc4-kube-api-access-mrh6v\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668127 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01241a54-7e00-4d18-b407-4278620b7ac7-config-volume\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668168 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-secret-volume\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7t76\" (UniqueName: \"kubernetes.io/projected/92680623-686c-4b48-9fcd-b5e9d558a758-kube-api-access-b7t76\") pod \"multus-admission-controller-857f4d67dd-mmmfs\" (UID: \"92680623-686c-4b48-9fcd-b5e9d558a758\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668539 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57bc2216-e8f7-4056-afa8-1daa3daf04db-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-webhook-cert\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668620 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01241a54-7e00-4d18-b407-4278620b7ac7-metrics-tls\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668705 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57bc2216-e8f7-4056-afa8-1daa3daf04db-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f56xt\" (UniqueName: \"kubernetes.io/projected/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-kube-api-access-f56xt\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.668787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-images\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669061 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669227 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a65120a3-0d66-481a-9f09-9f338a85cbc4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm27d\" (UniqueName: \"kubernetes.io/projected/daf9e0f1-18ae-4197-b6b1-9a11439768b8-kube-api-access-hm27d\") pod \"migrator-59844c95c7-g9ssn\" (UID: \"daf9e0f1-18ae-4197-b6b1-9a11439768b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-proxy-tls\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmfg\" (UniqueName: \"kubernetes.io/projected/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-kube-api-access-wlmfg\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: 
\"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-tmpfs\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a65120a3-0d66-481a-9f09-9f338a85cbc4-proxy-tls\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.669957 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-mountpoint-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.670334 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a65120a3-0d66-481a-9f09-9f338a85cbc4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.670582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.670622 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-tmpfs\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.688804 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.694531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aed27d0-7067-4b00-bd27-07e71dbb0ff6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-76lwx\" (UID: \"2aed27d0-7067-4b00-bd27-07e71dbb0ff6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.709420 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.728901 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.750267 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.769840 4743 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.771579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-config\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.771652 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-socket-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.771780 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.771846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-registration-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.771974 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqfxn\" (UniqueName: 
\"kubernetes.io/projected/7837cec9-3686-497f-b9ec-2525768cd8ce-kube-api-access-lqfxn\") pod \"auto-csr-approver-29552588-mtmtv\" (UID: \"7837cec9-3686-497f-b9ec-2525768cd8ce\") " pod="openshift-infra/auto-csr-approver-29552588-mtmtv" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772037 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhbr\" (UniqueName: \"kubernetes.io/projected/01241a54-7e00-4d18-b407-4278620b7ac7-kube-api-access-dbhbr\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772067 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkzs8\" (UniqueName: \"kubernetes.io/projected/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-kube-api-access-kkzs8\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-plugins-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-csi-data-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-registration-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qd6\" (UniqueName: \"kubernetes.io/projected/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-kube-api-access-l7qd6\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-plugins-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772481 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01241a54-7e00-4d18-b407-4278620b7ac7-config-volume\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-socket-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772516 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-secret-volume\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-csi-data-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772807 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01241a54-7e00-4d18-b407-4278620b7ac7-metrics-tls\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.772995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.773193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.773305 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-mountpoint-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.773397 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-mountpoint-dir\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.789218 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.802171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/538f8801-691a-4332-a1c7-ae0d1b570198-signing-key\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.809919 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.819969 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/538f8801-691a-4332-a1c7-ae0d1b570198-signing-cabundle\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.829169 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 
15:09:20.849565 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.862549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-apiservice-cert\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.863144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-webhook-cert\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.869727 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.881145 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92680623-686c-4b48-9fcd-b5e9d558a758-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mmmfs\" (UID: \"92680623-686c-4b48-9fcd-b5e9d558a758\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.889401 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.910126 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.929950 
4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.949555 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.962412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbc1d80-bce4-4e58-873d-6723e2020c66-serving-cert\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.967904 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.978577 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc1d80-bce4-4e58-873d-6723e2020c66-config\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:20 crc kubenswrapper[4743]: I0310 15:09:20.989308 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.008966 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.028981 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.048466 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.060496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19999585-e753-49c3-b70d-c8b5a406a823-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.069669 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.084072 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-metrics-certs\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.089029 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.109150 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.121316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-default-certificate\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.130140 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.144008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-stats-auth\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.149385 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.159774 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-service-ca-bundle\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.169796 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.190324 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.209989 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.223094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746f0851-522d-4354-9be2-0d370e2af3a2-metrics-tls\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 
15:09:21.236107 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.240597 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/746f0851-522d-4354-9be2-0d370e2af3a2-trusted-ca\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.248883 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.270636 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.288919 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.291757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19999585-e753-49c3-b70d-c8b5a406a823-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.308722 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.328132 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.350174 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.369323 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.390244 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.410462 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.420432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-images\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.430180 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.446008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-proxy-tls\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.449231 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.467149 4743 request.go:700] Waited for 1.011038359s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.469347 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.482442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/afc92a5c-a4ef-4ae8-9425-9787ea43ca0a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr4wq\" (UID: \"afc92a5c-a4ef-4ae8-9425-9787ea43ca0a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.489873 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.498645 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57bc2216-e8f7-4056-afa8-1daa3daf04db-config\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.510114 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" 
Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.529165 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.543190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57bc2216-e8f7-4056-afa8-1daa3daf04db-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.549590 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.570115 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.589483 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.595965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-secret-volume\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.600762 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.609947 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.626472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-srv-cert\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.629946 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.648292 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.655405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a65120a3-0d66-481a-9f09-9f338a85cbc4-proxy-tls\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.669987 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.689325 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.709545 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.728588 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.748623 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.769885 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.772227 4743 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.772371 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-config podName:5ed481d4-44a9-41b0-a0f1-32360dc3cb85 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:22.27233789 +0000 UTC m=+226.979152668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-config") pod "kube-controller-manager-operator-78b949d7b-xbvz2" (UID: "5ed481d4-44a9-41b0-a0f1-32360dc3cb85") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.773240 4743 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.773324 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01241a54-7e00-4d18-b407-4278620b7ac7-config-volume podName:01241a54-7e00-4d18-b407-4278620b7ac7 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:22.273310969 +0000 UTC m=+226.980125727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/01241a54-7e00-4d18-b407-4278620b7ac7-config-volume") pod "dns-default-wxcl8" (UID: "01241a54-7e00-4d18-b407-4278620b7ac7") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.773411 4743 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.773439 4743 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.773480 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume podName:e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:09:22.273465453 +0000 UTC m=+226.980280211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume") pod "collect-profiles-29552580-l25l7" (UID: "e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.773499 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-serving-cert podName:5ed481d4-44a9-41b0-a0f1-32360dc3cb85 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:22.273491814 +0000 UTC m=+226.980306572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-serving-cert") pod "kube-controller-manager-operator-78b949d7b-xbvz2" (UID: "5ed481d4-44a9-41b0-a0f1-32360dc3cb85") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.773503 4743 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: E0310 15:09:21.773540 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01241a54-7e00-4d18-b407-4278620b7ac7-metrics-tls podName:01241a54-7e00-4d18-b407-4278620b7ac7 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:22.273528135 +0000 UTC m=+226.980342893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01241a54-7e00-4d18-b407-4278620b7ac7-metrics-tls") pod "dns-default-wxcl8" (UID: "01241a54-7e00-4d18-b407-4278620b7ac7") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.790453 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.808608 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.829548 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.849046 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.868030 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.898089 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.909574 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.929427 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.949534 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.969603 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 15:09:21 crc kubenswrapper[4743]: I0310 15:09:21.990078 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.010366 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.029667 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.049875 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.069449 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.089957 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.109230 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.129440 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.149471 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.196569 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959nr\" (UniqueName: \"kubernetes.io/projected/4410bf70-11af-4388-9733-4d099fd8fff5-kube-api-access-959nr\") pod \"openshift-apiserver-operator-796bbdcf4f-ddlhj\" (UID: \"4410bf70-11af-4388-9733-4d099fd8fff5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.209666 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkczh\" (UniqueName: \"kubernetes.io/projected/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-kube-api-access-zkczh\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.230860 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46sl\" (UniqueName: \"kubernetes.io/projected/1bb47cb7-c74f-42ee-bffe-c83ea70fe119-kube-api-access-f46sl\") pod \"console-operator-58897d9998-65l5t\" (UID: \"1bb47cb7-c74f-42ee-bffe-c83ea70fe119\") " pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.251287 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chd4m\" (UniqueName: \"kubernetes.io/projected/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-kube-api-access-chd4m\") pod \"console-f9d7485db-v9bc6\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.263929 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42945\" (UniqueName: \"kubernetes.io/projected/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-kube-api-access-42945\") pod \"controller-manager-879f6c89f-cdt8j\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.284929 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdl7f\" (UniqueName: \"kubernetes.io/projected/5a2b935b-dc0d-4fec-9869-2a124ce4c274-kube-api-access-jdl7f\") pod \"apiserver-7bbb656c7d-hdwfr\" (UID: \"5a2b935b-dc0d-4fec-9869-2a124ce4c274\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.290850 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.301554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.301918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01241a54-7e00-4d18-b407-4278620b7ac7-config-volume\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.302017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01241a54-7e00-4d18-b407-4278620b7ac7-metrics-tls\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.302132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.302305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-config\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.303229 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.303704 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-config\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.306468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.308043 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7plq\" (UniqueName: \"kubernetes.io/projected/5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932-kube-api-access-s7plq\") pod \"dns-operator-744455d44c-w4rxv\" (UID: \"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.308885 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.313825 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01241a54-7e00-4d18-b407-4278620b7ac7-config-volume\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.319693 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.330056 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.330298 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.345523 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.351053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.356664 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.357228 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01241a54-7e00-4d18-b407-4278620b7ac7-metrics-tls\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.385751 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rndjn\" (UniqueName: \"kubernetes.io/projected/9164077f-fddc-43e6-9aac-23a8be818d9f-kube-api-access-rndjn\") pod \"oauth-openshift-558db77b4-l2lf6\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.410200 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.410788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc5x8\" (UniqueName: \"kubernetes.io/projected/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-kube-api-access-kc5x8\") pod \"route-controller-manager-6576b87f9c-xczzg\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.413806 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.429806 4743 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.449281 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.449566 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.467947 4743 request.go:700] Waited for 1.907380423s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/default/token Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.491353 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtgq7\" (UniqueName: \"kubernetes.io/projected/ec0a0850-2f3c-4a27-a08c-0820a360ace9-kube-api-access-dtgq7\") pod \"downloads-7954f5f757-59rpz\" (UID: \"ec0a0850-2f3c-4a27-a08c-0820a360ace9\") " pod="openshift-console/downloads-7954f5f757-59rpz" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.517569 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2wvr\" (UniqueName: \"kubernetes.io/projected/52534b9a-5f54-4def-baf0-e5755c7e98d2-kube-api-access-c2wvr\") pod \"etcd-operator-b45778765-97jtg\" (UID: \"52534b9a-5f54-4def-baf0-e5755c7e98d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.529004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtdzt\" (UniqueName: 
\"kubernetes.io/projected/759dea92-ff95-4a61-8fa3-bb23f6306128-kube-api-access-dtdzt\") pod \"machine-approver-56656f9798-5llhk\" (UID: \"759dea92-ff95-4a61-8fa3-bb23f6306128\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.536406 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.539081 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v9bc6"] Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.570767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbzl\" (UniqueName: \"kubernetes.io/projected/538f8801-691a-4332-a1c7-ae0d1b570198-kube-api-access-gsbzl\") pod \"service-ca-9c57cc56f-vfhfq\" (UID: \"538f8801-691a-4332-a1c7-ae0d1b570198\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.572056 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gp2n\" (UniqueName: \"kubernetes.io/projected/d02c0611-45a7-4760-8187-4fd2b39f7dd4-kube-api-access-7gp2n\") pod \"machine-api-operator-5694c8668f-mlm2p\" (UID: \"d02c0611-45a7-4760-8187-4fd2b39f7dd4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.574535 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.584757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28rsb\" (UniqueName: \"kubernetes.io/projected/25bb44ff-d318-434a-82a7-0605d1fb57f2-kube-api-access-28rsb\") pod \"openshift-config-operator-7777fb866f-qrqrp\" (UID: \"25bb44ff-d318-434a-82a7-0605d1fb57f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.607994 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.610535 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cdt8j"] Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.634581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9572\" (UniqueName: \"kubernetes.io/projected/bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd-kube-api-access-g9572\") pod \"openshift-controller-manager-operator-756b6f6bc6-b44lp\" (UID: \"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.646890 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dljq\" (UniqueName: \"kubernetes.io/projected/16986a56-efd6-49bb-9953-f6cf8d5b5e3d-kube-api-access-4dljq\") pod \"cluster-samples-operator-665b6dd947-wrn4x\" (UID: \"16986a56-efd6-49bb-9953-f6cf8d5b5e3d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.653866 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4hx9\" 
(UniqueName: \"kubernetes.io/projected/320568c9-bd4b-4ebd-a575-650bcdd5d104-kube-api-access-k4hx9\") pod \"apiserver-76f77b778f-658rk\" (UID: \"320568c9-bd4b-4ebd-a575-650bcdd5d104\") " pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.666370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqxtj\" (UniqueName: \"kubernetes.io/projected/f1af669d-9f68-40aa-8789-3b8166784d40-kube-api-access-tqxtj\") pod \"authentication-operator-69f744f599-6gjk2\" (UID: \"f1af669d-9f68-40aa-8789-3b8166784d40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.684524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9mtg\" (UniqueName: \"kubernetes.io/projected/2aed27d0-7067-4b00-bd27-07e71dbb0ff6-kube-api-access-w9mtg\") pod \"package-server-manager-789f6589d5-76lwx\" (UID: \"2aed27d0-7067-4b00-bd27-07e71dbb0ff6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:22 crc kubenswrapper[4743]: W0310 15:09:22.685856 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4370e3f8_d9d3_48c8_a10f_d19c28342bb6.slice/crio-8f32dc78fc8fd0289edd870ce561d3542ca62cd3f1e46c12b7e520bac61b7f78 WatchSource:0}: Error finding container 8f32dc78fc8fd0289edd870ce561d3542ca62cd3f1e46c12b7e520bac61b7f78: Status 404 returned error can't find the container with id 8f32dc78fc8fd0289edd870ce561d3542ca62cd3f1e46c12b7e520bac61b7f78 Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.702456 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-59rpz" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.704302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd2b8b7d-a9b2-4299-8253-7c71ab4e0619-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fhw4m\" (UID: \"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.722288 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.729859 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.733182 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.739053 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.750236 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.756869 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.764310 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.772533 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.772796 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.815528 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj"] Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.817777 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr"] Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.828534 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt4fn\" (UniqueName: \"kubernetes.io/projected/746f0851-522d-4354-9be2-0d370e2af3a2-kube-api-access-jt4fn\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.841782 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.860002 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.865515 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.881452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvbmd\" (UniqueName: \"kubernetes.io/projected/9cbc1d80-bce4-4e58-873d-6723e2020c66-kube-api-access-pvbmd\") pod \"service-ca-operator-777779d784-2d45q\" (UID: \"9cbc1d80-bce4-4e58-873d-6723e2020c66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.881949 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgfrk\" (UniqueName: \"kubernetes.io/projected/edc06df9-1e96-4bb1-892a-d9fa1bcd6341-kube-api-access-kgfrk\") pod \"machine-config-operator-74547568cd-9tw2t\" (UID: \"edc06df9-1e96-4bb1-892a-d9fa1bcd6341\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.889793 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xswl\" (UniqueName: \"kubernetes.io/projected/afc92a5c-a4ef-4ae8-9425-9787ea43ca0a-kube-api-access-4xswl\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr4wq\" (UID: \"afc92a5c-a4ef-4ae8-9425-9787ea43ca0a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.935345 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/746f0851-522d-4354-9be2-0d370e2af3a2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lrgf6\" (UID: \"746f0851-522d-4354-9be2-0d370e2af3a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.946442 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-97jtg"] Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.950086 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-65l5t"] Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.951249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrh6v\" (UniqueName: \"kubernetes.io/projected/a65120a3-0d66-481a-9f09-9f338a85cbc4-kube-api-access-mrh6v\") pod \"machine-config-controller-84d6567774-jr75n\" (UID: \"a65120a3-0d66-481a-9f09-9f338a85cbc4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.951420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62xxd\" (UniqueName: \"kubernetes.io/projected/19999585-e753-49c3-b70d-c8b5a406a823-kube-api-access-62xxd\") pod \"kube-storage-version-migrator-operator-b67b599dd-lvzgw\" (UID: \"19999585-e753-49c3-b70d-c8b5a406a823\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.966629 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4rxv"] Mar 10 15:09:22 crc kubenswrapper[4743]: I0310 15:09:22.988877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7t76\" (UniqueName: \"kubernetes.io/projected/92680623-686c-4b48-9fcd-b5e9d558a758-kube-api-access-b7t76\") pod \"multus-admission-controller-857f4d67dd-mmmfs\" (UID: \"92680623-686c-4b48-9fcd-b5e9d558a758\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.007517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f56xt\" (UniqueName: 
\"kubernetes.io/projected/7ef8002a-10d8-4758-a110-7b31fd2fe9e1-kube-api-access-f56xt\") pod \"olm-operator-6b444d44fb-66vbz\" (UID: \"7ef8002a-10d8-4758-a110-7b31fd2fe9e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.008690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7wl5\" (UniqueName: \"kubernetes.io/projected/1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48-kube-api-access-f7wl5\") pod \"router-default-5444994796-nsqd8\" (UID: \"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48\") " pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:23 crc kubenswrapper[4743]: W0310 15:09:23.022125 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ddaf9f3_b2a3_4a5c_9514_1e3c47d47932.slice/crio-1595b4b57c5ecc88b4127fd8dea1d532f28d8dbb21e8bb2b03ba20d4203adb7f WatchSource:0}: Error finding container 1595b4b57c5ecc88b4127fd8dea1d532f28d8dbb21e8bb2b03ba20d4203adb7f: Status 404 returned error can't find the container with id 1595b4b57c5ecc88b4127fd8dea1d532f28d8dbb21e8bb2b03ba20d4203adb7f Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.024012 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.026848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57bc2216-e8f7-4056-afa8-1daa3daf04db-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-48kwm\" (UID: \"57bc2216-e8f7-4056-afa8-1daa3daf04db\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.049640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm27d\" 
(UniqueName: \"kubernetes.io/projected/daf9e0f1-18ae-4197-b6b1-9a11439768b8-kube-api-access-hm27d\") pod \"migrator-59844c95c7-g9ssn\" (UID: \"daf9e0f1-18ae-4197-b6b1-9a11439768b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.064591 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmfg\" (UniqueName: \"kubernetes.io/projected/bbed94ba-a259-4809-8b9e-72a5a26eb3b1-kube-api-access-wlmfg\") pod \"packageserver-d55dfcdfc-9pdwr\" (UID: \"bbed94ba-a259-4809-8b9e-72a5a26eb3b1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.076839 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.083687 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.085439 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqfxn\" (UniqueName: \"kubernetes.io/projected/7837cec9-3686-497f-b9ec-2525768cd8ce-kube-api-access-lqfxn\") pod \"auto-csr-approver-29552588-mtmtv\" (UID: \"7837cec9-3686-497f-b9ec-2525768cd8ce\") " pod="openshift-infra/auto-csr-approver-29552588-mtmtv" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.096263 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.109145 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.115512 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkzs8\" (UniqueName: \"kubernetes.io/projected/3c0b5287-3555-4c8f-a6cc-7e689b3046e1-kube-api-access-kkzs8\") pod \"csi-hostpathplugin-5ps5r\" (UID: \"3c0b5287-3555-4c8f-a6cc-7e689b3046e1\") " pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.128042 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.136619 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.140450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhbr\" (UniqueName: \"kubernetes.io/projected/01241a54-7e00-4d18-b407-4278620b7ac7-kube-api-access-dbhbr\") pod \"dns-default-wxcl8\" (UID: \"01241a54-7e00-4d18-b407-4278620b7ac7\") " pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.140826 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.148398 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.161759 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.163171 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2lf6"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.167362 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qd6\" (UniqueName: \"kubernetes.io/projected/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-kube-api-access-l7qd6\") pod \"collect-profiles-29552580-l25l7\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.170505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ed481d4-44a9-41b0-a0f1-32360dc3cb85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xbvz2\" (UID: \"5ed481d4-44a9-41b0-a0f1-32360dc3cb85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.184921 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.194431 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.204429 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.208859 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-59rpz"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.231535 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-658rk"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.260069 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261195 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c73693-6f88-4967-922a-7d0521a41343-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261277 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/199e5a98-b472-45af-9088-ffe163ceba78-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl64j\" (UniqueName: \"kubernetes.io/projected/115e302d-a5bc-4198-a214-7aeb0e74f1cd-kube-api-access-dl64j\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" 
Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l66d6\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261438 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54c73693-6f88-4967-922a-7d0521a41343-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-bound-sa-token\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/54c73693-6f88-4967-922a-7d0521a41343-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5-cert\") pod \"ingress-canary-zpv9r\" (UID: \"47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5\") " pod="openshift-ingress-canary/ingress-canary-zpv9r" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261555 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-registry-tls\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pbnj\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-kube-api-access-4pbnj\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261650 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-trusted-ca\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: 
I0310 15:09:23.261706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l66d6\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/115e302d-a5bc-4198-a214-7aeb0e74f1cd-srv-cert\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvkf\" (UniqueName: \"kubernetes.io/projected/ba41eb29-8687-44ad-8001-642d0ff1fd7f-kube-api-access-2pvkf\") pod \"marketplace-operator-79b997595-l66d6\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261794 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74kx9\" (UniqueName: \"kubernetes.io/projected/47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5-kube-api-access-74kx9\") pod \"ingress-canary-zpv9r\" (UID: \"47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5\") " pod="openshift-ingress-canary/ingress-canary-zpv9r" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261829 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/115e302d-a5bc-4198-a214-7aeb0e74f1cd-profile-collector-cert\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261865 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/199e5a98-b472-45af-9088-ffe163ceba78-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.261884 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-registry-certificates\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.262011 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:23.761987682 +0000 UTC m=+228.468802590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.281526 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.292335 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.301118 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.326037 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.341631 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.362463 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.362633 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:23.862603285 +0000 UTC m=+228.569418033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.362753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/199e5a98-b472-45af-9088-ffe163ceba78-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.362827 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-registry-certificates\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.362878 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wddql\" (UniqueName: \"kubernetes.io/projected/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-kube-api-access-wddql\") pod \"machine-config-server-5vnfn\" (UID: \"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.362973 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-node-bootstrap-token\") pod \"machine-config-server-5vnfn\" (UID: \"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.362997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c73693-6f88-4967-922a-7d0521a41343-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/199e5a98-b472-45af-9088-ffe163ceba78-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 
crc kubenswrapper[4743]: I0310 15:09:23.363062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl64j\" (UniqueName: \"kubernetes.io/projected/115e302d-a5bc-4198-a214-7aeb0e74f1cd-kube-api-access-dl64j\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l66d6\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363325 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54c73693-6f88-4967-922a-7d0521a41343-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363430 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-bound-sa-token\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54c73693-6f88-4967-922a-7d0521a41343-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363469 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5-cert\") pod \"ingress-canary-zpv9r\" (UID: \"47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5\") " pod="openshift-ingress-canary/ingress-canary-zpv9r" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-registry-tls\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pbnj\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-kube-api-access-4pbnj\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363619 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-certs\") pod \"machine-config-server-5vnfn\" (UID: \"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-trusted-ca\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.364184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c73693-6f88-4967-922a-7d0521a41343-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.365696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l66d6\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.366907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/199e5a98-b472-45af-9088-ffe163ceba78-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 
15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.368680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-registry-certificates\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.369377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-trusted-ca\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.370356 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/199e5a98-b472-45af-9088-ffe163ceba78-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.371630 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-registry-tls\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.363913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l66d6\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.374372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5-cert\") pod \"ingress-canary-zpv9r\" (UID: \"47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5\") " pod="openshift-ingress-canary/ingress-canary-zpv9r" Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.370248 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:23.870037982 +0000 UTC m=+228.576852720 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.377182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/115e302d-a5bc-4198-a214-7aeb0e74f1cd-srv-cert\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.377250 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvkf\" (UniqueName: \"kubernetes.io/projected/ba41eb29-8687-44ad-8001-642d0ff1fd7f-kube-api-access-2pvkf\") pod \"marketplace-operator-79b997595-l66d6\" (UID: 
\"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.377274 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74kx9\" (UniqueName: \"kubernetes.io/projected/47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5-kube-api-access-74kx9\") pod \"ingress-canary-zpv9r\" (UID: \"47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5\") " pod="openshift-ingress-canary/ingress-canary-zpv9r" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.377314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/115e302d-a5bc-4198-a214-7aeb0e74f1cd-profile-collector-cert\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: W0310 15:09:23.376404 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1da9c1ba_ea6b_41e2_9e9b_8c7876ca7c48.slice/crio-031216b9cee2a0926ff7284a35374fa95eec5954a980fb2db82620fcad8c59d6 WatchSource:0}: Error finding container 031216b9cee2a0926ff7284a35374fa95eec5954a980fb2db82620fcad8c59d6: Status 404 returned error can't find the container with id 031216b9cee2a0926ff7284a35374fa95eec5954a980fb2db82620fcad8c59d6 Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.378267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l66d6\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.381279 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54c73693-6f88-4967-922a-7d0521a41343-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.384610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/115e302d-a5bc-4198-a214-7aeb0e74f1cd-srv-cert\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.385540 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/115e302d-a5bc-4198-a214-7aeb0e74f1cd-profile-collector-cert\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.406739 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl64j\" (UniqueName: \"kubernetes.io/projected/115e302d-a5bc-4198-a214-7aeb0e74f1cd-kube-api-access-dl64j\") pod \"catalog-operator-68c6474976-hhfsf\" (UID: \"115e302d-a5bc-4198-a214-7aeb0e74f1cd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.427264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pbnj\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-kube-api-access-4pbnj\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.449440 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54c73693-6f88-4967-922a-7d0521a41343-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tkzf4\" (UID: \"54c73693-6f88-4967-922a-7d0521a41343\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: W0310 15:09:23.462385 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25bb44ff_d318_434a_82a7_0605d1fb57f2.slice/crio-2587d79e6fd4ea5188eb081e84cd341bb75042ca4e5c897664976aeb8be8987d WatchSource:0}: Error finding container 2587d79e6fd4ea5188eb081e84cd341bb75042ca4e5c897664976aeb8be8987d: Status 404 returned error can't find the container with id 2587d79e6fd4ea5188eb081e84cd341bb75042ca4e5c897664976aeb8be8987d Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.467846 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.473674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-bound-sa-token\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.476908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" event={"ID":"5a2b935b-dc0d-4fec-9869-2a124ce4c274","Type":"ContainerStarted","Data":"f63abc74f47c6f40ce997dd464c6e7845fc67fbb5e1838d4a2f81d865730f7a5"} Mar 10 
15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.478495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.479836 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:23.979795992 +0000 UTC m=+228.686610740 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.479887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wddql\" (UniqueName: \"kubernetes.io/projected/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-kube-api-access-wddql\") pod \"machine-config-server-5vnfn\" (UID: \"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.479936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-node-bootstrap-token\") pod \"machine-config-server-5vnfn\" (UID: 
\"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.480032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.480076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-certs\") pod \"machine-config-server-5vnfn\" (UID: \"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.483239 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:23.983219392 +0000 UTC m=+228.690034140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.486633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvkf\" (UniqueName: \"kubernetes.io/projected/ba41eb29-8687-44ad-8001-642d0ff1fd7f-kube-api-access-2pvkf\") pod \"marketplace-operator-79b997595-l66d6\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.486988 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-node-bootstrap-token\") pod \"machine-config-server-5vnfn\" (UID: \"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.488614 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-certs\") pod \"machine-config-server-5vnfn\" (UID: \"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.498315 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-65l5t" 
event={"ID":"1bb47cb7-c74f-42ee-bffe-c83ea70fe119","Type":"ContainerStarted","Data":"3c0667513e1f438e687defbc8cf3f2c066f79b90494371f4dad36d7e3b127cfe"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.498393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-65l5t" event={"ID":"1bb47cb7-c74f-42ee-bffe-c83ea70fe119","Type":"ContainerStarted","Data":"6fbcbac5975408c030ed8eaaad2a2928c7c8d3598c61dfaf4a4df60a3d8b696d"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.498990 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.499442 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.504189 4743 patch_prober.go:28] interesting pod/console-operator-58897d9998-65l5t container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.504280 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-65l5t" podUID="1bb47cb7-c74f-42ee-bffe-c83ea70fe119" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.506936 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.531482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74kx9\" 
(UniqueName: \"kubernetes.io/projected/47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5-kube-api-access-74kx9\") pod \"ingress-canary-zpv9r\" (UID: \"47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5\") " pod="openshift-ingress-canary/ingress-canary-zpv9r" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.537930 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.554337 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vfhfq"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.558016 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.561535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wddql\" (UniqueName: \"kubernetes.io/projected/cba6c5c0-919b-488d-aef0-9cc4ebbf983b-kube-api-access-wddql\") pod \"machine-config-server-5vnfn\" (UID: \"cba6c5c0-919b-488d-aef0-9cc4ebbf983b\") " pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.563869 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6gjk2"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.566271 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mlm2p"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.566931 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.567271 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.572645 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" event={"ID":"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932","Type":"ContainerStarted","Data":"1595b4b57c5ecc88b4127fd8dea1d532f28d8dbb21e8bb2b03ba20d4203adb7f"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.573640 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.583610 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.584082 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.084059762 +0000 UTC m=+228.790874510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.586465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-59rpz" event={"ID":"ec0a0850-2f3c-4a27-a08c-0820a360ace9","Type":"ContainerStarted","Data":"fb1ede812f5bc1d5880e2b8c246676d60882e1a025a6eb7760cc9b158151d8de"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.605552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" event={"ID":"759dea92-ff95-4a61-8fa3-bb23f6306128","Type":"ContainerStarted","Data":"9e8559f45ad6c14c881a7dce216b827e02b0f2195e9ffe942510fa0a8d3cf442"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.605598 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" event={"ID":"759dea92-ff95-4a61-8fa3-bb23f6306128","Type":"ContainerStarted","Data":"b470cf1a70c436534139ebf806f09287aa747def879e72de08b07699880cdfaf"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.610086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" event={"ID":"a257bbc1-f866-4d43-9011-7ed9bd6d13e9","Type":"ContainerStarted","Data":"d9523735b51b58083c873df5af265bd1a0e8cd77e443c7477db82f7460b09e33"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.613636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" event={"ID":"4410bf70-11af-4388-9733-4d099fd8fff5","Type":"ContainerStarted","Data":"a3fa3c7ebc527f8abe94907c5a8662ca78eee6ca2d08ff857be9def7a65d7654"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.613684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" event={"ID":"4410bf70-11af-4388-9733-4d099fd8fff5","Type":"ContainerStarted","Data":"cd63a903b947d51fdda8671600f082724790eef73fdb93fca61eb881652ed66a"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.613880 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zpv9r" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.620215 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" event={"ID":"4370e3f8-d9d3-48c8-a10f-d19c28342bb6","Type":"ContainerStarted","Data":"2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.620261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" event={"ID":"4370e3f8-d9d3-48c8-a10f-d19c28342bb6","Type":"ContainerStarted","Data":"8f32dc78fc8fd0289edd870ce561d3542ca62cd3f1e46c12b7e520bac61b7f78"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.620477 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.622052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-658rk" event={"ID":"320568c9-bd4b-4ebd-a575-650bcdd5d104","Type":"ContainerStarted","Data":"b4dd4e8614802f496eae1242d72bd50c39b340513d42901f5e33da60a8bb7ab3"} Mar 10 15:09:23 crc 
kubenswrapper[4743]: I0310 15:09:23.622744 4743 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cdt8j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.622805 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" podUID="4370e3f8-d9d3-48c8-a10f-d19c28342bb6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.623847 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nsqd8" event={"ID":"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48","Type":"ContainerStarted","Data":"031216b9cee2a0926ff7284a35374fa95eec5954a980fb2db82620fcad8c59d6"} Mar 10 15:09:23 crc kubenswrapper[4743]: W0310 15:09:23.626264 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2b8b7d_a9b2_4299_8253_7c71ab4e0619.slice/crio-3eff1f36b70f55512a2fe71227dd8439dcf771f75516c14544d1d6a3d9ece69d WatchSource:0}: Error finding container 3eff1f36b70f55512a2fe71227dd8439dcf771f75516c14544d1d6a3d9ece69d: Status 404 returned error can't find the container with id 3eff1f36b70f55512a2fe71227dd8439dcf771f75516c14544d1d6a3d9ece69d Mar 10 15:09:23 crc kubenswrapper[4743]: W0310 15:09:23.628524 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod538f8801_691a_4332_a1c7_ae0d1b570198.slice/crio-ee56bda5bbddb550ac37bcd37ab6a4726b29dc4a084fead7a59e8b62db58a9d2 WatchSource:0}: Error finding container 
ee56bda5bbddb550ac37bcd37ab6a4726b29dc4a084fead7a59e8b62db58a9d2: Status 404 returned error can't find the container with id ee56bda5bbddb550ac37bcd37ab6a4726b29dc4a084fead7a59e8b62db58a9d2 Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.629093 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" event={"ID":"52534b9a-5f54-4def-baf0-e5755c7e98d2","Type":"ContainerStarted","Data":"393b872e1fc4beb82afe41980234ac8cca9083e5f70c383b123fc6e8e8c146a6"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.632128 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v9bc6" event={"ID":"bc7402d9-c20f-4429-bda9-db2b1ccddf8e","Type":"ContainerStarted","Data":"b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.632152 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v9bc6" event={"ID":"bc7402d9-c20f-4429-bda9-db2b1ccddf8e","Type":"ContainerStarted","Data":"ceb2c78694f440ee9a33dd83374acb9173a88e5e0e9b538ca0b58dae64f1f974"} Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.633743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" event={"ID":"9164077f-fddc-43e6-9aac-23a8be818d9f","Type":"ContainerStarted","Data":"4d2e9781747ecc9bf22a968b648fc31eb1dbd2e209ee5ab625974d241775daee"} Mar 10 15:09:23 crc kubenswrapper[4743]: W0310 15:09:23.653573 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1af669d_9f68_40aa_8789_3b8166784d40.slice/crio-c1b78444aa41d0f257b9e41e5735241900e244ec065c3ba0484d886b4b27798e WatchSource:0}: Error finding container c1b78444aa41d0f257b9e41e5735241900e244ec065c3ba0484d886b4b27798e: Status 404 returned error can't find the container with id 
c1b78444aa41d0f257b9e41e5735241900e244ec065c3ba0484d886b4b27798e Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.655668 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mmmfs"] Mar 10 15:09:23 crc kubenswrapper[4743]: W0310 15:09:23.659985 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5b17f3_18e2_4bf5_a48f_d752c54fc4dd.slice/crio-de985adfa35f61ad8b743f01b8c2b6c1aa0b5f5c9c10be9db7aa6c2de1929ea3 WatchSource:0}: Error finding container de985adfa35f61ad8b743f01b8c2b6c1aa0b5f5c9c10be9db7aa6c2de1929ea3: Status 404 returned error can't find the container with id de985adfa35f61ad8b743f01b8c2b6c1aa0b5f5c9c10be9db7aa6c2de1929ea3 Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.662939 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5vnfn" Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.686469 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.687876 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.187863089 +0000 UTC m=+228.894677837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.722005 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.788022 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.791558 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.291532662 +0000 UTC m=+228.998347410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.802272 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2d45q"] Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.889695 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.890208 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.390184308 +0000 UTC m=+229.096999056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.990777 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.991307 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.491283126 +0000 UTC m=+229.198097874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:23 crc kubenswrapper[4743]: I0310 15:09:23.991412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:23 crc kubenswrapper[4743]: E0310 15:09:23.991999 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.491984206 +0000 UTC m=+229.198798954 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.095020 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.095366 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.595326849 +0000 UTC m=+229.302141597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.097451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.098088 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.598073239 +0000 UTC m=+229.304887987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.167589 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.199841 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.200090 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.700050972 +0000 UTC m=+229.406865720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.200264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.200746 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.700736632 +0000 UTC m=+229.407551370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.289400 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-v9bc6" podStartSLOduration=181.289382877 podStartE2EDuration="3m1.289382877s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:24.288016617 +0000 UTC m=+228.994831365" watchObservedRunningTime="2026-03-10 15:09:24.289382877 +0000 UTC m=+228.996197625" Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.302471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.303262 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.803053856 +0000 UTC m=+229.509868604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.407957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.408489 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:24.908471769 +0000 UTC m=+229.615286517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.424977 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.451699 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.503992 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.519197 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.519517 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.019495756 +0000 UTC m=+229.726310504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.620626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.621195 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.121174751 +0000 UTC m=+229.827989499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.642685 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddlhj" podStartSLOduration=181.642657797 podStartE2EDuration="3m1.642657797s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:24.616187115 +0000 UTC m=+229.323001853" watchObservedRunningTime="2026-03-10 15:09:24.642657797 +0000 UTC m=+229.349472545" Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.644789 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5ps5r"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.667282 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.714517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" event={"ID":"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd","Type":"ContainerStarted","Data":"de985adfa35f61ad8b743f01b8c2b6c1aa0b5f5c9c10be9db7aa6c2de1929ea3"} Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.724952 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" 
event={"ID":"9cbc1d80-bce4-4e58-873d-6723e2020c66","Type":"ContainerStarted","Data":"305b627043e33e3278f980171ca136fe27dbe3d901297f3cb650c35b4d32fbb3"} Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.730523 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.731095 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.231074275 +0000 UTC m=+229.937889033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.756519 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.779089 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-mtmtv"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.793066 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wxcl8"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.813002 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" event={"ID":"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619","Type":"ContainerStarted","Data":"3eff1f36b70f55512a2fe71227dd8439dcf771f75516c14544d1d6a3d9ece69d"} Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.850527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.850852 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.350841467 +0000 UTC m=+230.057656215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.854034 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5vnfn" event={"ID":"cba6c5c0-919b-488d-aef0-9cc4ebbf983b","Type":"ContainerStarted","Data":"3c2fcfd0fe0ae75208a3bd87f7c6fdc2827c2d67853ea621ea8cb2fbbefcd6ca"} Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.868490 4743 generic.go:334] "Generic (PLEG): container finished" podID="5a2b935b-dc0d-4fec-9869-2a124ce4c274" containerID="0d96ce7af12d65a1b2cff4063fb156dddd52a164b11263ed0ce16515aab6453d" exitCode=0 Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.868612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" event={"ID":"5a2b935b-dc0d-4fec-9869-2a124ce4c274","Type":"ContainerDied","Data":"0d96ce7af12d65a1b2cff4063fb156dddd52a164b11263ed0ce16515aab6453d"} Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.877248 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.878732 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-65l5t" podStartSLOduration=181.87871823 podStartE2EDuration="3m1.87871823s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-10 15:09:24.877509895 +0000 UTC m=+229.584324643" watchObservedRunningTime="2026-03-10 15:09:24.87871823 +0000 UTC m=+229.585532978" Mar 10 15:09:24 crc kubenswrapper[4743]: W0310 15:09:24.887843 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda65120a3_0d66_481a_9f09_9f338a85cbc4.slice/crio-0cb45b1cefae4861a37939e4704f5005e2dbab39353f10cb4dd40ea5bee8eaf1 WatchSource:0}: Error finding container 0cb45b1cefae4861a37939e4704f5005e2dbab39353f10cb4dd40ea5bee8eaf1: Status 404 returned error can't find the container with id 0cb45b1cefae4861a37939e4704f5005e2dbab39353f10cb4dd40ea5bee8eaf1 Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.891277 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.926681 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" podStartSLOduration=181.926655307 podStartE2EDuration="3m1.926655307s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:24.917718957 +0000 UTC m=+229.624533705" watchObservedRunningTime="2026-03-10 15:09:24.926655307 +0000 UTC m=+229.633470055" Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.928881 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn"] Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.930831 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" 
event={"ID":"57bc2216-e8f7-4056-afa8-1daa3daf04db","Type":"ContainerStarted","Data":"137ee04e6af4c90a78b6935fc6f6985da3fad220589ebd0460d947dbae1315ca"} Mar 10 15:09:24 crc kubenswrapper[4743]: I0310 15:09:24.951323 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:24 crc kubenswrapper[4743]: E0310 15:09:24.952629 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.452591904 +0000 UTC m=+230.159406652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.008018 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l66d6"] Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.013863 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" event={"ID":"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932","Type":"ContainerStarted","Data":"f7e5546e9507242ddcfbfe9de954d3d800269ead2b528349959d3634319ac35a"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.042264 4743 provider.go:102] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.053536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.054147 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.554130504 +0000 UTC m=+230.260945252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.065170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" event={"ID":"f1af669d-9f68-40aa-8789-3b8166784d40","Type":"ContainerStarted","Data":"02cdb30489b53cf331a4bc3da0d3fd203604d8d91845e63836ddf1a68187e341"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.065226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" 
event={"ID":"f1af669d-9f68-40aa-8789-3b8166784d40","Type":"ContainerStarted","Data":"c1b78444aa41d0f257b9e41e5735241900e244ec065c3ba0484d886b4b27798e"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.077616 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf"] Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.097648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" event={"ID":"16986a56-efd6-49bb-9953-f6cf8d5b5e3d","Type":"ContainerStarted","Data":"63e7091852619aaee2b3433305329371f9db37df0ee7c7a3a837e54fb319ed47"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.157495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.158395 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.658375194 +0000 UTC m=+230.365189932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.211030 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zpv9r"] Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.211660 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" event={"ID":"759dea92-ff95-4a61-8fa3-bb23f6306128","Type":"ContainerStarted","Data":"78d6a11048b3d7627b759a91c339ee34b156b5e53ef304581391386d0b2819ef"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.257878 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4"] Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.263298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.265237 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.765217089 +0000 UTC m=+230.472031837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.298940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" event={"ID":"52534b9a-5f54-4def-baf0-e5755c7e98d2","Type":"ContainerStarted","Data":"740d4991dde321a969067cbc0dde538cba88876d0f06f1c37148fd18193dab50"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.318145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" event={"ID":"d02c0611-45a7-4760-8187-4fd2b39f7dd4","Type":"ContainerStarted","Data":"0d27f2b594f0831a1ec077d48f8ace1245c7892906a673a6d7e107ee981c823f"} Mar 10 15:09:25 crc kubenswrapper[4743]: W0310 15:09:25.338546 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47f9f6c2_2f6f_4c93_9c88_fc9db0d9b5e5.slice/crio-934fe3778d0dc5e2405a9e145c6cb61c2ff307422f06aff3f7ded196425c46fa WatchSource:0}: Error finding container 934fe3778d0dc5e2405a9e145c6cb61c2ff307422f06aff3f7ded196425c46fa: Status 404 returned error can't find the container with id 934fe3778d0dc5e2405a9e145c6cb61c2ff307422f06aff3f7ded196425c46fa Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.357396 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" 
event={"ID":"25bb44ff-d318-434a-82a7-0605d1fb57f2","Type":"ContainerStarted","Data":"2587d79e6fd4ea5188eb081e84cd341bb75042ca4e5c897664976aeb8be8987d"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.360914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" event={"ID":"a257bbc1-f866-4d43-9011-7ed9bd6d13e9","Type":"ContainerStarted","Data":"28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.362619 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.375440 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.377326 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.877299327 +0000 UTC m=+230.584114075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.389234 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" event={"ID":"538f8801-691a-4332-a1c7-ae0d1b570198","Type":"ContainerStarted","Data":"ee56bda5bbddb550ac37bcd37ab6a4726b29dc4a084fead7a59e8b62db58a9d2"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.442470 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-97jtg" podStartSLOduration=182.442442506 podStartE2EDuration="3m2.442442506s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:25.392576102 +0000 UTC m=+230.099390850" watchObservedRunningTime="2026-03-10 15:09:25.442442506 +0000 UTC m=+230.149257274" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.445906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" event={"ID":"2aed27d0-7067-4b00-bd27-07e71dbb0ff6","Type":"ContainerStarted","Data":"f892d30936626a03205e9a784f30c8b990ce514db8ca2f83c4728b058a65909c"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.475677 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5llhk" podStartSLOduration=182.475648334 podStartE2EDuration="3m2.475648334s" 
podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:25.466967831 +0000 UTC m=+230.173782579" watchObservedRunningTime="2026-03-10 15:09:25.475648334 +0000 UTC m=+230.182463082" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.478117 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.478478 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:25.978463876 +0000 UTC m=+230.685278624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.508232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" event={"ID":"bbed94ba-a259-4809-8b9e-72a5a26eb3b1","Type":"ContainerStarted","Data":"7e2fb0affe2de7b8363d5efde24149c91c05935b68ecb4bd05261a316d49ebdf"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.513991 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" podStartSLOduration=181.513967371 podStartE2EDuration="3m1.513967371s" podCreationTimestamp="2026-03-10 15:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:25.511565141 +0000 UTC m=+230.218379879" watchObservedRunningTime="2026-03-10 15:09:25.513967371 +0000 UTC m=+230.220782119" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.528943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" event={"ID":"92680623-686c-4b48-9fcd-b5e9d558a758","Type":"ContainerStarted","Data":"91cb7068055a6d89b9301d94d528c4ecc88ca8f589b6a573942f8390e159788d"} Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.554003 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-6gjk2" podStartSLOduration=182.553982948 
podStartE2EDuration="3m2.553982948s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:25.552838865 +0000 UTC m=+230.259653613" watchObservedRunningTime="2026-03-10 15:09:25.553982948 +0000 UTC m=+230.260797696" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.576807 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" podStartSLOduration=181.576782843 podStartE2EDuration="3m1.576782843s" podCreationTimestamp="2026-03-10 15:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:25.574314371 +0000 UTC m=+230.281129119" watchObservedRunningTime="2026-03-10 15:09:25.576782843 +0000 UTC m=+230.283597601" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.579266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.579564 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.079532393 +0000 UTC m=+230.786347141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.580042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.581567 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.584116 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.084099246 +0000 UTC m=+230.790913994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.647368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.702876 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.703561 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.203509968 +0000 UTC m=+230.910324716 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.768192 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-65l5t" Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.806218 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.806682 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.306668315 +0000 UTC m=+231.013483063 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.911592 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:25 crc kubenswrapper[4743]: E0310 15:09:25.912034 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.412013597 +0000 UTC m=+231.118828345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:25 crc kubenswrapper[4743]: I0310 15:09:25.915456 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39554: no serving certificate available for the kubelet" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.013806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.014242 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.514219487 +0000 UTC m=+231.221034235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.039051 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39566: no serving certificate available for the kubelet" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.116690 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.117275 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.617252021 +0000 UTC m=+231.324066769 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.117540 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.123514 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.623494053 +0000 UTC m=+231.330308801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.130310 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39582: no serving certificate available for the kubelet" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.219530 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.220202 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.719974415 +0000 UTC m=+231.426789163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.237027 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39588: no serving certificate available for the kubelet" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.323376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.323738 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.82372039 +0000 UTC m=+231.530535148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.343975 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39594: no serving certificate available for the kubelet" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.424373 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.424803 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:26.924787837 +0000 UTC m=+231.631602585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.435564 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39604: no serving certificate available for the kubelet" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.527036 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.527401 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:27.027380109 +0000 UTC m=+231.734194857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.536089 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39616: no serving certificate available for the kubelet" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.633713 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.634160 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:27.134120211 +0000 UTC m=+231.840934959 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.638214 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" event={"ID":"bbed94ba-a259-4809-8b9e-72a5a26eb3b1","Type":"ContainerStarted","Data":"9a183a26a478d1507904150e984c69744d388b3bce60bd91bf6c16bea6f6df07"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.639066 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.642684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" event={"ID":"7ef8002a-10d8-4758-a110-7b31fd2fe9e1","Type":"ContainerStarted","Data":"28a686dbbde03c9fa2503b118e2e57c936cb9e461ca33440427e3bad4dd3249e"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.663924 4743 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9pdwr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.664004 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" podUID="bbed94ba-a259-4809-8b9e-72a5a26eb3b1" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.668285 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39626: no serving certificate available for the kubelet" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.689777 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" podStartSLOduration=183.689752293 podStartE2EDuration="3m3.689752293s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:26.688373663 +0000 UTC m=+231.395188411" watchObservedRunningTime="2026-03-10 15:09:26.689752293 +0000 UTC m=+231.396567051" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.697282 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5vnfn" event={"ID":"cba6c5c0-919b-488d-aef0-9cc4ebbf983b","Type":"ContainerStarted","Data":"cba983177014aa255f6cea83895efd9d5c5d0313e637d9dea8c41749c2fb739c"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.717941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" event={"ID":"16986a56-efd6-49bb-9953-f6cf8d5b5e3d","Type":"ContainerStarted","Data":"e5f5fe97842a2cb6acac99a6a10cad240a3968ef4d9339bb776d35b2beedd0f3"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.758426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.762357 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:27.262305868 +0000 UTC m=+231.969120616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.772647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" event={"ID":"bf5b17f3-18e2-4bf5-a48f-d752c54fc4dd","Type":"ContainerStarted","Data":"2b018ec15ca6941a3b5a7b8aedc2ce54a49115f5f799b64041413c5dba8ecf9e"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.796152 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5vnfn" podStartSLOduration=6.796101033 podStartE2EDuration="6.796101033s" podCreationTimestamp="2026-03-10 15:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:26.73252548 +0000 UTC m=+231.439340228" watchObservedRunningTime="2026-03-10 15:09:26.796101033 +0000 UTC m=+231.502915791" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.802920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" event={"ID":"9cbc1d80-bce4-4e58-873d-6723e2020c66","Type":"ContainerStarted","Data":"d878c90f91b11587ec766ef832410fa0f80646c3f97d94fcb0605992830cf97e"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.854075 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b44lp" podStartSLOduration=183.854041922 podStartE2EDuration="3m3.854041922s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:26.842791514 +0000 UTC m=+231.549606262" watchObservedRunningTime="2026-03-10 15:09:26.854041922 +0000 UTC m=+231.560856670" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.858890 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" event={"ID":"9164077f-fddc-43e6-9aac-23a8be818d9f","Type":"ContainerStarted","Data":"19d23ff6fa2796978a844d0c9165002eedc6b343d42db71f24adbca6f6ea9f3f"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.860152 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.870030 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.873627 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:27.373595822 +0000 UTC m=+232.080410570 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.886915 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wxcl8" event={"ID":"01241a54-7e00-4d18-b407-4278620b7ac7","Type":"ContainerStarted","Data":"1a218cd0abf8239ca4a63fbb2196d1ba39f76c4adf69c9e2ffad376c066d297f"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.946384 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" event={"ID":"19999585-e753-49c3-b70d-c8b5a406a823","Type":"ContainerStarted","Data":"49d719c1f70fec38298777b8613f76e28a69627236ea802f99c008bc6c322db1"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.946441 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" event={"ID":"19999585-e753-49c3-b70d-c8b5a406a823","Type":"ContainerStarted","Data":"45d3168fc74201d75b77cf2e29537d0efeb997164aee6721bae1d1a8abddd3c7"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.970460 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" podStartSLOduration=183.970386464 podStartE2EDuration="3m3.970386464s" podCreationTimestamp="2026-03-10 
15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:26.964257045 +0000 UTC m=+231.671071793" watchObservedRunningTime="2026-03-10 15:09:26.970386464 +0000 UTC m=+231.677201212" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.970731 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2d45q" podStartSLOduration=182.970726424 podStartE2EDuration="3m2.970726424s" podCreationTimestamp="2026-03-10 15:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:26.911380814 +0000 UTC m=+231.618195562" watchObservedRunningTime="2026-03-10 15:09:26.970726424 +0000 UTC m=+231.677541172" Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.972613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" event={"ID":"3c0b5287-3555-4c8f-a6cc-7e689b3046e1","Type":"ContainerStarted","Data":"96751a8473c453ac22d10b355ba006dfd214774cf719c3f664f331fac5a23993"} Mar 10 15:09:26 crc kubenswrapper[4743]: I0310 15:09:26.974233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:26 crc kubenswrapper[4743]: E0310 15:09:26.974773 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 15:09:27.474754041 +0000 UTC m=+232.181568789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.007642 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lvzgw" podStartSLOduration=184.00761541 podStartE2EDuration="3m4.00761541s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.006080845 +0000 UTC m=+231.712895593" watchObservedRunningTime="2026-03-10 15:09:27.00761541 +0000 UTC m=+231.714430158" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.030563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" event={"ID":"2aed27d0-7067-4b00-bd27-07e71dbb0ff6","Type":"ContainerStarted","Data":"a8c396402ef085fed17314bc243bc22167413ebfeecee9f9801374e941651f8f"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.033115 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.069652 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" 
podStartSLOduration=184.069622207 podStartE2EDuration="3m4.069622207s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.052239911 +0000 UTC m=+231.759054659" watchObservedRunningTime="2026-03-10 15:09:27.069622207 +0000 UTC m=+231.776436955" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.076634 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.078086 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:27.578059953 +0000 UTC m=+232.284874711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.087980 4743 generic.go:334] "Generic (PLEG): container finished" podID="25bb44ff-d318-434a-82a7-0605d1fb57f2" containerID="943310b960bc0db5642548370a9348c6759a72188921271cc132f1caeae60d63" exitCode=0 Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.088336 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" event={"ID":"25bb44ff-d318-434a-82a7-0605d1fb57f2","Type":"ContainerDied","Data":"943310b960bc0db5642548370a9348c6759a72188921271cc132f1caeae60d63"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.092322 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.111140 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.139928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" event={"ID":"ba41eb29-8687-44ad-8001-642d0ff1fd7f","Type":"ContainerStarted","Data":"8e0f192db5deba03e306442575a0b6730c9271dc4a719a595fe0fcb030396891"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.141180 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 
15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.148880 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-l66d6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.148933 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.164300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" event={"ID":"d02c0611-45a7-4760-8187-4fd2b39f7dd4","Type":"ContainerStarted","Data":"b7c776f86be34eb1bca2c288e8b8c056e71390659d289886aae39a4457beb6c1"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.175073 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" podStartSLOduration=184.175038231 podStartE2EDuration="3m4.175038231s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.157576662 +0000 UTC m=+231.864391410" watchObservedRunningTime="2026-03-10 15:09:27.175038231 +0000 UTC m=+231.881852979" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.185801 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.187695 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:27.68768246 +0000 UTC m=+232.394497208 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.193534 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" event={"ID":"7837cec9-3686-497f-b9ec-2525768cd8ce","Type":"ContainerStarted","Data":"800d2fb78eca325b4edd7282a23ace9db1ba411818bead60dc6de13b9d5bc041"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.246135 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" event={"ID":"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89","Type":"ContainerStarted","Data":"e5ca210bdab9aeadfaaa4fdaccea09d81610e74728e650ea7841c4c954e71317"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.246198 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" event={"ID":"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89","Type":"ContainerStarted","Data":"ac8cf216071eb114f067243d0f5bd94eead414349c06d11190bcdecf9fc94245"} Mar 10 15:09:27 crc 
kubenswrapper[4743]: I0310 15:09:27.251571 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" podStartSLOduration=184.251555332 podStartE2EDuration="3m4.251555332s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.243055454 +0000 UTC m=+231.949870202" watchObservedRunningTime="2026-03-10 15:09:27.251555332 +0000 UTC m=+231.958370080" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.288521 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.289879 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:27.789850409 +0000 UTC m=+232.496665157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.290062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vfhfq" event={"ID":"538f8801-691a-4332-a1c7-ae0d1b570198","Type":"ContainerStarted","Data":"c7078b2117e892ba67305bf0d8c358cc0a38a7e4d8fbd5cac1ba7ad8425f5ced"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.359393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-59rpz" event={"ID":"ec0a0850-2f3c-4a27-a08c-0820a360ace9","Type":"ContainerStarted","Data":"81617c17e5700810009a399c3312a5626a278a016060462991ce788229581e52"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.360797 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-59rpz" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.368126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" event={"ID":"afc92a5c-a4ef-4ae8-9425-9787ea43ca0a","Type":"ContainerStarted","Data":"a3e17ad7dfebe276c06ec6b4f73950c24581a5d8d17c6caffcb01083b6678bca"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.368196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" event={"ID":"afc92a5c-a4ef-4ae8-9425-9787ea43ca0a","Type":"ContainerStarted","Data":"006d1dcf5f795d09a4132dc70c313b476a6362e0d8678061d8c97d733c2cb0fd"} Mar 10 15:09:27 crc 
kubenswrapper[4743]: I0310 15:09:27.390747 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" event={"ID":"115e302d-a5bc-4198-a214-7aeb0e74f1cd","Type":"ContainerStarted","Data":"ca09b42e4029fc0abb2c82c42620bbbc93cc800acd50086b813b6bfeda40fea0"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.391241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.398100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.402138 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:27.902114832 +0000 UTC m=+232.608929580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.414252 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39628: no serving certificate available for the kubelet" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.417292 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" podStartSLOduration=184.417245733 podStartE2EDuration="3m4.417245733s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.356266025 +0000 UTC m=+232.063080773" watchObservedRunningTime="2026-03-10 15:09:27.417245733 +0000 UTC m=+232.124060481" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.430345 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-59rpz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.430399 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-59rpz" podUID="ec0a0850-2f3c-4a27-a08c-0820a360ace9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.433348 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" podStartSLOduration=184.433314311 podStartE2EDuration="3m4.433314311s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.394352775 +0000 UTC m=+232.101167523" watchObservedRunningTime="2026-03-10 15:09:27.433314311 +0000 UTC m=+232.140129059" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.467453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" event={"ID":"a65120a3-0d66-481a-9f09-9f338a85cbc4","Type":"ContainerStarted","Data":"8aa1630ccf552dd118f9bf95c7a4cfecc9667834825481eecad65bfbc8b768c9"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.467517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" event={"ID":"a65120a3-0d66-481a-9f09-9f338a85cbc4","Type":"ContainerStarted","Data":"0cb45b1cefae4861a37939e4704f5005e2dbab39353f10cb4dd40ea5bee8eaf1"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.469120 4743 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hhfsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.469158 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" podUID="115e302d-a5bc-4198-a214-7aeb0e74f1cd" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection 
refused" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.471387 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr4wq" podStartSLOduration=184.471374311 podStartE2EDuration="3m4.471374311s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.468978251 +0000 UTC m=+232.175792999" watchObservedRunningTime="2026-03-10 15:09:27.471374311 +0000 UTC m=+232.178189059" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.514021 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.514994 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.014975572 +0000 UTC m=+232.721790320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.530770 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-59rpz" podStartSLOduration=184.530741962 podStartE2EDuration="3m4.530741962s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.528504127 +0000 UTC m=+232.235318875" watchObservedRunningTime="2026-03-10 15:09:27.530741962 +0000 UTC m=+232.237556710" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.567054 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" event={"ID":"edc06df9-1e96-4bb1-892a-d9fa1bcd6341","Type":"ContainerStarted","Data":"bab7a9fdb1272f58d89a90a638621f5488617162bfbc0c81f91727ad3fc26263"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.577410 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" podStartSLOduration=184.577393692 podStartE2EDuration="3m4.577393692s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.576542017 +0000 UTC m=+232.283356765" watchObservedRunningTime="2026-03-10 15:09:27.577393692 +0000 UTC m=+232.284208440" Mar 10 15:09:27 
crc kubenswrapper[4743]: I0310 15:09:27.602100 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" event={"ID":"54c73693-6f88-4967-922a-7d0521a41343","Type":"ContainerStarted","Data":"f7feaaed1ad28176c32af73d7e2bb250f9b3a074fc6c81d70a48fa3eb562d518"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.616849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.617986 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.117970335 +0000 UTC m=+232.824785083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.619697 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" podStartSLOduration=184.619682815 podStartE2EDuration="3m4.619682815s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.618874362 +0000 UTC m=+232.325689110" watchObservedRunningTime="2026-03-10 15:09:27.619682815 +0000 UTC m=+232.326497553" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.648344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" event={"ID":"daf9e0f1-18ae-4197-b6b1-9a11439768b8","Type":"ContainerStarted","Data":"e16861fa9724ef4f75c7e1ced83a396b5d3044200dccdecec624f60c8e6cfa00"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.672168 4743 generic.go:334] "Generic (PLEG): container finished" podID="320568c9-bd4b-4ebd-a575-650bcdd5d104" containerID="797b8e9130ca6889d40b37be995e0d3e0e84a146fd7ff503e36d2a999c300972" exitCode=0 Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.672278 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-658rk" event={"ID":"320568c9-bd4b-4ebd-a575-650bcdd5d104","Type":"ContainerDied","Data":"797b8e9130ca6889d40b37be995e0d3e0e84a146fd7ff503e36d2a999c300972"} Mar 10 15:09:27 
crc kubenswrapper[4743]: I0310 15:09:27.688942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nsqd8" event={"ID":"1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48","Type":"ContainerStarted","Data":"3c3ac760a92239707dec8d989ddb5692ad6bffb840bc24ec37dcf27d1d97e376"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.718655 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.718692 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" podStartSLOduration=184.718671101 podStartE2EDuration="3m4.718671101s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.675341258 +0000 UTC m=+232.382156156" watchObservedRunningTime="2026-03-10 15:09:27.718671101 +0000 UTC m=+232.425485849" Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.719028 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.218999741 +0000 UTC m=+232.925814489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.719500 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.721433 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.221415371 +0000 UTC m=+232.928230319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.736030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" event={"ID":"746f0851-522d-4354-9be2-0d370e2af3a2","Type":"ContainerStarted","Data":"b2abd55ed92bffffbcc10d4a3bd4c9ed5154a88c4162d2a559d3caf92555b373"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.742585 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" podStartSLOduration=184.742559938 podStartE2EDuration="3m4.742559938s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.740957941 +0000 UTC m=+232.447772689" watchObservedRunningTime="2026-03-10 15:09:27.742559938 +0000 UTC m=+232.449374686" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.764433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zpv9r" event={"ID":"47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5","Type":"ContainerStarted","Data":"934fe3778d0dc5e2405a9e145c6cb61c2ff307422f06aff3f7ded196425c46fa"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.779519 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" 
event={"ID":"bd2b8b7d-a9b2-4299-8253-7c71ab4e0619","Type":"ContainerStarted","Data":"bc167c1fcc48ec5920db7e7460ff8f638acb20a7dc249d51fc2adc6c9091527e"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.793201 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nsqd8" podStartSLOduration=184.793175434 podStartE2EDuration="3m4.793175434s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.79134138 +0000 UTC m=+232.498156118" watchObservedRunningTime="2026-03-10 15:09:27.793175434 +0000 UTC m=+232.499990182" Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.796848 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" event={"ID":"5ed481d4-44a9-41b0-a0f1-32360dc3cb85","Type":"ContainerStarted","Data":"c04e9ebd7a31090957b2aa93b7449e64775c168826718c4b601f9ac46d232a4a"} Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.822491 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.824240 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.324208728 +0000 UTC m=+233.031023626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.924997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:27 crc kubenswrapper[4743]: E0310 15:09:27.936217 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.436198904 +0000 UTC m=+233.143013652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:27 crc kubenswrapper[4743]: I0310 15:09:27.965253 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" podStartSLOduration=184.96523161 podStartE2EDuration="3m4.96523161s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.900155923 +0000 UTC m=+232.606970671" watchObservedRunningTime="2026-03-10 15:09:27.96523161 +0000 UTC m=+232.672046358" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.027873 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.028226 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.528206616 +0000 UTC m=+233.235021364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.043649 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zpv9r" podStartSLOduration=8.043626346 podStartE2EDuration="8.043626346s" podCreationTimestamp="2026-03-10 15:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:27.965597491 +0000 UTC m=+232.672412229" watchObservedRunningTime="2026-03-10 15:09:28.043626346 +0000 UTC m=+232.750441094" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.090430 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" podStartSLOduration=185.09041074 podStartE2EDuration="3m5.09041074s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:28.04616885 +0000 UTC m=+232.752983598" watchObservedRunningTime="2026-03-10 15:09:28.09041074 +0000 UTC m=+232.797225488" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.090804 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhw4m" podStartSLOduration=185.090798681 podStartE2EDuration="3m5.090798681s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:28.088301578 +0000 UTC m=+232.795116326" watchObservedRunningTime="2026-03-10 15:09:28.090798681 +0000 UTC m=+232.797613429" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.111117 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.119087 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:28 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:28 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:28 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.119157 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.131958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.132358 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 15:09:28.632342412 +0000 UTC m=+233.339157160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.233439 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.234011 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.733988716 +0000 UTC m=+233.440803464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.335309 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.335743 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.835702162 +0000 UTC m=+233.542516910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.436744 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.437006 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.936969064 +0000 UTC m=+233.643783802 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.437458 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.437990 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:28.937974413 +0000 UTC m=+233.644789161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.541393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.541943 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.041921464 +0000 UTC m=+233.748736212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.643077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.643434 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.143417433 +0000 UTC m=+233.850232181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.745021 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.745782 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.245760807 +0000 UTC m=+233.952575555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.775397 4743 ???:1] "http: TLS handshake error from 192.168.126.11:39642: no serving certificate available for the kubelet" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.830791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" event={"ID":"5ddaf9f3-b2a3-4a5c-9514-1e3c47d47932","Type":"ContainerStarted","Data":"b0e36869cd127003dd8896903ba30bee5ab2b94921882a430583b46f1df612dd"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.847609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.848202 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.348179203 +0000 UTC m=+234.054993951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.856120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wxcl8" event={"ID":"01241a54-7e00-4d18-b407-4278620b7ac7","Type":"ContainerStarted","Data":"90c21a9767beafbd2941dac8d9c2f6fcce95b5c083096176b77b2deb39525bd8"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.856177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wxcl8" event={"ID":"01241a54-7e00-4d18-b407-4278620b7ac7","Type":"ContainerStarted","Data":"ad99875ef617506a6413bfbab49604c949e6fbe44b1e5342268b0db30e3a7cf6"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.856643 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.863331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" event={"ID":"edc06df9-1e96-4bb1-892a-d9fa1bcd6341","Type":"ContainerStarted","Data":"d1873c1f5b9e54f9ce5d181569cc26f41b41e325a58f781a44f8caf9739c1951"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.863408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9tw2t" event={"ID":"edc06df9-1e96-4bb1-892a-d9fa1bcd6341","Type":"ContainerStarted","Data":"81af362e985e93d40ab7e15f8f4081dc943f6e343b2045454fcd1f2a6cca786c"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 
15:09:28.867381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" event={"ID":"92680623-686c-4b48-9fcd-b5e9d558a758","Type":"ContainerStarted","Data":"13e776891ec203af7221f937592b977d56a03892877fab8c84ec295587a09b0a"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.867442 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" event={"ID":"92680623-686c-4b48-9fcd-b5e9d558a758","Type":"ContainerStarted","Data":"cb0cb67fbe05e87c4bb5c0a61cbcfb2ffa81f12a0a25328420dfcada857b05f0"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.889139 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-w4rxv" podStartSLOduration=185.889122067 podStartE2EDuration="3m5.889122067s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:28.888239742 +0000 UTC m=+233.595054480" watchObservedRunningTime="2026-03-10 15:09:28.889122067 +0000 UTC m=+233.595936815" Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.893479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" event={"ID":"54c73693-6f88-4967-922a-7d0521a41343","Type":"ContainerStarted","Data":"afbba0ba477fd90a76fb9b9fc9bae627df965dc0a35c3c7387509a49fecda52d"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.928276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlm2p" event={"ID":"d02c0611-45a7-4760-8187-4fd2b39f7dd4","Type":"ContainerStarted","Data":"b05544163f82d9fc2d3ee1e0243c8149dd74ed4ec30c9f3d243f2d8f1887d20f"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.949457 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:28 crc kubenswrapper[4743]: E0310 15:09:28.951166 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.451144676 +0000 UTC m=+234.157959424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.961229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" event={"ID":"57bc2216-e8f7-4056-afa8-1daa3daf04db","Type":"ContainerStarted","Data":"073953270c6697ab821fba777a166bf6a0f2202df552883d2a4bbf2b43e30288"} Mar 10 15:09:28 crc kubenswrapper[4743]: I0310 15:09:28.971495 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmmfs" podStartSLOduration=185.971465758 podStartE2EDuration="3m5.971465758s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
15:09:28.928834725 +0000 UTC m=+233.635649473" watchObservedRunningTime="2026-03-10 15:09:28.971465758 +0000 UTC m=+233.678280506" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.009108 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tkzf4" podStartSLOduration=186.009082535 podStartE2EDuration="3m6.009082535s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:29.005455849 +0000 UTC m=+233.712270597" watchObservedRunningTime="2026-03-10 15:09:29.009082535 +0000 UTC m=+233.715897283" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.009488 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wxcl8" podStartSLOduration=9.009483577 podStartE2EDuration="9.009483577s" podCreationTimestamp="2026-03-10 15:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:28.973346543 +0000 UTC m=+233.680161291" watchObservedRunningTime="2026-03-10 15:09:29.009483577 +0000 UTC m=+233.716298325" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.019621 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" event={"ID":"25bb44ff-d318-434a-82a7-0605d1fb57f2","Type":"ContainerStarted","Data":"482f92a3e8c9cc7e2014d0e7632a05904a8c3205925b880c15c301128416f8a8"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.041839 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" 
event={"ID":"16986a56-efd6-49bb-9953-f6cf8d5b5e3d","Type":"ContainerStarted","Data":"69798162aee1f91c50cb95a29cf7346f8c5714655194277ccf5ec7100aa66193"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.053233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.054737 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.554717405 +0000 UTC m=+234.261532153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.068480 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xbvz2" event={"ID":"5ed481d4-44a9-41b0-a0f1-32360dc3cb85","Type":"ContainerStarted","Data":"587ac7720f95c8dbf3aa8e158d526818cdc611ff88774152d365f2a8811a6d1e"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.085946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" 
event={"ID":"3c0b5287-3555-4c8f-a6cc-7e689b3046e1","Type":"ContainerStarted","Data":"c22f0be175b798cd1f182e6f988a663e528235adda42da97b89a063cebd94c99"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.087400 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-48kwm" podStartSLOduration=186.087372538 podStartE2EDuration="3m6.087372538s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:29.053411067 +0000 UTC m=+233.760225805" watchObservedRunningTime="2026-03-10 15:09:29.087372538 +0000 UTC m=+233.794187286" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.098986 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wrn4x" podStartSLOduration=186.098965936 podStartE2EDuration="3m6.098965936s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:29.086333937 +0000 UTC m=+233.793148685" watchObservedRunningTime="2026-03-10 15:09:29.098965936 +0000 UTC m=+233.805780684" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.108796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" event={"ID":"115e302d-a5bc-4198-a214-7aeb0e74f1cd","Type":"ContainerStarted","Data":"967e7223259476942bd7aa5d616fb197e69bc2f1c4db4e847c5da0723f2a16bd"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.114423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" 
event={"ID":"746f0851-522d-4354-9be2-0d370e2af3a2","Type":"ContainerStarted","Data":"769e5c252cd3b2479401972984440a3dcbdcd63059417bd0e86b212435c2502c"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.114466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lrgf6" event={"ID":"746f0851-522d-4354-9be2-0d370e2af3a2","Type":"ContainerStarted","Data":"1591096a1decd1bcd2f8141883aa786406621d6ecd03145cb118f1080d5ee5af"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.123077 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:29 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:29 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:29 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.123153 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.140941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" event={"ID":"ba41eb29-8687-44ad-8001-642d0ff1fd7f","Type":"ContainerStarted","Data":"b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.142087 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-l66d6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" 
start-of-body= Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.142168 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.154571 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.155856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" event={"ID":"5a2b935b-dc0d-4fec-9869-2a124ce4c274","Type":"ContainerStarted","Data":"90474b8dcf488bf8c8f4d3499c92aa1bf089f1658526490f05f18aa626cb1a02"} Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.156542 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.656496233 +0000 UTC m=+234.363310981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.170236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" event={"ID":"7ef8002a-10d8-4758-a110-7b31fd2fe9e1","Type":"ContainerStarted","Data":"1e0d5461091abf3f69be3dc1765cf0f7722f1ce4e201dde4d48dc023bea26bc3"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.171405 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.175192 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhfsf" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.194089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zpv9r" event={"ID":"47f9f6c2-2f6f-4c93-9c88-fc9db0d9b5e5","Type":"ContainerStarted","Data":"0eebbdbd0544a7e5f18490baacc26384f8a80c8e0c759071f7e4a59d675cd502"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.194633 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" podStartSLOduration=186.194608884 podStartE2EDuration="3m6.194608884s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
15:09:29.193036158 +0000 UTC m=+233.899850906" watchObservedRunningTime="2026-03-10 15:09:29.194608884 +0000 UTC m=+233.901423632" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.197433 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.199566 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" event={"ID":"daf9e0f1-18ae-4197-b6b1-9a11439768b8","Type":"ContainerStarted","Data":"d57770921edcade0c66dfd0e5da2ea57769c42cd0fb0a49f8d8149e396ada4c0"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.199592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9ssn" event={"ID":"daf9e0f1-18ae-4197-b6b1-9a11439768b8","Type":"ContainerStarted","Data":"cac33b3ff50008aeff62aae9d8b1bc7c607c291111d431c823fecbaf83da2e74"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.210521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-658rk" event={"ID":"320568c9-bd4b-4ebd-a575-650bcdd5d104","Type":"ContainerStarted","Data":"80f40fe6717e039401e74ee7fd814d727ec76d329f45f7f5a0f6177722a407bd"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.229987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jr75n" event={"ID":"a65120a3-0d66-481a-9f09-9f338a85cbc4","Type":"ContainerStarted","Data":"eba8bead348e10a8d0932886c0660ea07f8af1b8f5e60d49c9fd695651c74f8f"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.234285 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cdt8j"] Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.239494 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" event={"ID":"2aed27d0-7067-4b00-bd27-07e71dbb0ff6","Type":"ContainerStarted","Data":"a3a845d9947d917c393093fc69aacbe99ff630c3857af8eb1a1e8359669bdfb0"} Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.240989 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-59rpz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.241054 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-59rpz" podUID="ec0a0850-2f3c-4a27-a08c-0820a360ace9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.241503 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" podUID="4370e3f8-d9d3-48c8-a10f-d19c28342bb6" containerName="controller-manager" containerID="cri-o://2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd" gracePeriod=30 Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.260394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.261012 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.760985979 +0000 UTC m=+234.467800897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.276312 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg"] Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.297562 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66vbz" podStartSLOduration=186.297539085 podStartE2EDuration="3m6.297539085s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:29.290275023 +0000 UTC m=+233.997089771" watchObservedRunningTime="2026-03-10 15:09:29.297539085 +0000 UTC m=+234.004353833" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.362997 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.365277 4743 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.865248099 +0000 UTC m=+234.572062847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.470047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.470671 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:29.970650582 +0000 UTC m=+234.677465330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.470995 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-658rk" podStartSLOduration=186.470980862 podStartE2EDuration="3m6.470980862s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:29.470076766 +0000 UTC m=+234.176891514" watchObservedRunningTime="2026-03-10 15:09:29.470980862 +0000 UTC m=+234.177795610" Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.574662 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.575137 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.075102668 +0000 UTC m=+234.781917556 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.677053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.677561 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.177539814 +0000 UTC m=+234.884354562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.778417 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.778518 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.278492558 +0000 UTC m=+234.985307306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.778844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.779166 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.279157557 +0000 UTC m=+234.985972305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.879831 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.880137 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.3801004 +0000 UTC m=+235.086915148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.880363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.880770 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.38076122 +0000 UTC m=+235.087575968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:29 crc kubenswrapper[4743]: I0310 15:09:29.983609 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:29 crc kubenswrapper[4743]: E0310 15:09:29.984118 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.484097853 +0000 UTC m=+235.190912601 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.020783 4743 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-qrqrp container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.020888 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" podUID="25bb44ff-d318-434a-82a7-0605d1fb57f2" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.087708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.088138 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.588124086 +0000 UTC m=+235.294938834 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.113566 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:30 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:30 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:30 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.113626 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.183255 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.190013 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.190313 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.690276874 +0000 UTC m=+235.397091612 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.190527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.191025 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.691007805 +0000 UTC m=+235.397822553 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.193025 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9pdwr" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.251346 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" event={"ID":"3c0b5287-3555-4c8f-a6cc-7e689b3046e1","Type":"ContainerStarted","Data":"6807f97f09e7262be9558e01fe711929f1fb46132855c2713d9ff08041ae1c68"} Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.263472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-658rk" event={"ID":"320568c9-bd4b-4ebd-a575-650bcdd5d104","Type":"ContainerStarted","Data":"f3b85a044256a9e5b732c1c1510157ac991f30beba41b1fe2d7c9e27a9b55b15"} Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.275126 4743 generic.go:334] "Generic (PLEG): container finished" podID="4370e3f8-d9d3-48c8-a10f-d19c28342bb6" containerID="2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd" exitCode=0 Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.278517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" event={"ID":"4370e3f8-d9d3-48c8-a10f-d19c28342bb6","Type":"ContainerDied","Data":"2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd"} Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.279395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" event={"ID":"4370e3f8-d9d3-48c8-a10f-d19c28342bb6","Type":"ContainerDied","Data":"8f32dc78fc8fd0289edd870ce561d3542ca62cd3f1e46c12b7e520bac61b7f78"} Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.279426 4743 scope.go:117] "RemoveContainer" containerID="2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.278644 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cdt8j" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.282716 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-59rpz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.282782 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-59rpz" podUID="ec0a0850-2f3c-4a27-a08c-0820a360ace9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.283302 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-l66d6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" 
start-of-body= Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.283353 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.283549 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8qbmf"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.284704 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" podUID="a257bbc1-f866-4d43-9011-7ed9bd6d13e9" containerName="route-controller-manager" containerID="cri-o://28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd" gracePeriod=30 Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.285483 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4370e3f8-d9d3-48c8-a10f-d19c28342bb6" containerName="controller-manager" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.285510 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4370e3f8-d9d3-48c8-a10f-d19c28342bb6" containerName="controller-manager" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.285673 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4370e3f8-d9d3-48c8-a10f-d19c28342bb6" containerName="controller-manager" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.286734 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.291893 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-serving-cert\") pod \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.292047 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.292179 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.792155904 +0000 UTC m=+235.498970652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.292205 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42945\" (UniqueName: \"kubernetes.io/projected/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-kube-api-access-42945\") pod \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.292243 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-proxy-ca-bundles\") pod \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.292566 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-config\") pod \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.292591 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-client-ca\") pod \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\" (UID: \"4370e3f8-d9d3-48c8-a10f-d19c28342bb6\") " Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.293702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.294022 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.794014689 +0000 UTC m=+235.500829437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.296446 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.306589 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-client-ca" (OuterVolumeSpecName: "client-ca") pod "4370e3f8-d9d3-48c8-a10f-d19c28342bb6" (UID: "4370e3f8-d9d3-48c8-a10f-d19c28342bb6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.308414 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-config" (OuterVolumeSpecName: "config") pod "4370e3f8-d9d3-48c8-a10f-d19c28342bb6" (UID: "4370e3f8-d9d3-48c8-a10f-d19c28342bb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.310410 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4370e3f8-d9d3-48c8-a10f-d19c28342bb6" (UID: "4370e3f8-d9d3-48c8-a10f-d19c28342bb6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.310507 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qbmf"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.323261 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-kube-api-access-42945" (OuterVolumeSpecName: "kube-api-access-42945") pod "4370e3f8-d9d3-48c8-a10f-d19c28342bb6" (UID: "4370e3f8-d9d3-48c8-a10f-d19c28342bb6"). InnerVolumeSpecName "kube-api-access-42945". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.335726 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4370e3f8-d9d3-48c8-a10f-d19c28342bb6" (UID: "4370e3f8-d9d3-48c8-a10f-d19c28342bb6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.395007 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.395552 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.895524137 +0000 UTC m=+235.602338885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.396911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.397419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-catalog-content\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.397826 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-utilities\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.399984 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:30.899962197 +0000 UTC m=+235.606776945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.401614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzd7j\" (UniqueName: \"kubernetes.io/projected/5fb16ec3-618c-4095-a3e7-3f59920d921b-kube-api-access-gzd7j\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.425225 4743 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.425269 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.425298 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.425319 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42945\" (UniqueName: \"kubernetes.io/projected/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-kube-api-access-42945\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.425331 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4370e3f8-d9d3-48c8-a10f-d19c28342bb6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.430415 4743 scope.go:117] "RemoveContainer" containerID="2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.431928 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd\": container with ID starting with 2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd not found: ID does not exist" containerID="2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.432010 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd"} err="failed to get container status \"2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd\": rpc error: code = NotFound desc = could not find container \"2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd\": container with ID starting with 2ae1f0097667889d51da3bb06a6da19104f31f6f6a0a15999dc1a8660f1354fd not found: ID does not exist" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.468261 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f6tgf"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.472803 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.482791 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.483682 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6tgf"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.526489 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.526841 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:31.026799715 +0000 UTC m=+235.733614463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.526958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzd7j\" (UniqueName: \"kubernetes.io/projected/5fb16ec3-618c-4095-a3e7-3f59920d921b-kube-api-access-gzd7j\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.527135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.527180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m29gj\" (UniqueName: \"kubernetes.io/projected/d815b880-6675-42e2-8380-3e1aaae065a7-kube-api-access-m29gj\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.527203 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-catalog-content\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.527227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-utilities\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.529426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-catalog-content\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.529564 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-utilities\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.530112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-utilities\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.530459 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:31.030444731 +0000 UTC m=+235.737259479 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.530909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-catalog-content\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.564184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzd7j\" (UniqueName: \"kubernetes.io/projected/5fb16ec3-618c-4095-a3e7-3f59920d921b-kube-api-access-gzd7j\") pod \"community-operators-8qbmf\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.630481 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.630849 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m29gj\" (UniqueName: \"kubernetes.io/projected/d815b880-6675-42e2-8380-3e1aaae065a7-kube-api-access-m29gj\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.630877 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-catalog-content\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.630893 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-utilities\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.631392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-utilities\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.631478 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:31.131455816 +0000 UTC m=+235.838270564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.632004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-catalog-content\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.649620 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cdt8j"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.649683 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cdt8j"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.652169 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m29gj\" (UniqueName: \"kubernetes.io/projected/d815b880-6675-42e2-8380-3e1aaae065a7-kube-api-access-m29gj\") pod \"certified-operators-f6tgf\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.660583 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.664287 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h4q2k"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.669489 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.678559 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h4q2k"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.725867 4743 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.736961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq52p\" (UniqueName: \"kubernetes.io/projected/ec21919b-a512-42f8-b1ce-80498821cb65-kube-api-access-lq52p\") pod \"community-operators-h4q2k\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.737059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-catalog-content\") pod \"community-operators-h4q2k\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.737093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-utilities\") pod \"community-operators-h4q2k\" 
(UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.737242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.737952 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:31.237928381 +0000 UTC m=+235.944743129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.817356 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qrqrp" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.841328 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.841541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.841753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq52p\" (UniqueName: \"kubernetes.io/projected/ec21919b-a512-42f8-b1ce-80498821cb65-kube-api-access-lq52p\") pod \"community-operators-h4q2k\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.841806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-catalog-content\") pod \"community-operators-h4q2k\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.841859 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-utilities\") pod \"community-operators-h4q2k\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.841994 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:09:31.341956584 +0000 UTC m=+236.048771322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.843495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-utilities\") pod \"community-operators-h4q2k\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.844727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-catalog-content\") pod \"community-operators-h4q2k\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.864067 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mkgdp"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.865198 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.878535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq52p\" (UniqueName: \"kubernetes.io/projected/ec21919b-a512-42f8-b1ce-80498821cb65-kube-api-access-lq52p\") pod \"community-operators-h4q2k\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.916863 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mkgdp"] Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.935584 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.943182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.943271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-catalog-content\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.943298 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9z62\" (UniqueName: 
\"kubernetes.io/projected/2b8d4a30-a71d-48ca-b702-772f0e08c566-kube-api-access-s9z62\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.943322 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-utilities\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.943692 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:31.443679449 +0000 UTC m=+236.150494197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mxlth" (UID: "199e5a98-b472-45af-9088-ffe163ceba78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.969539 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9"] Mar 10 15:09:30 crc kubenswrapper[4743]: E0310 15:09:30.969794 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a257bbc1-f866-4d43-9011-7ed9bd6d13e9" containerName="route-controller-manager" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.969807 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a257bbc1-f866-4d43-9011-7ed9bd6d13e9" containerName="route-controller-manager" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.969978 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a257bbc1-f866-4d43-9011-7ed9bd6d13e9" containerName="route-controller-manager" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.970373 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.978473 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.981487 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.981690 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.984800 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.985607 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.985849 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:09:30 crc kubenswrapper[4743]: I0310 15:09:30.991569 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.014292 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.016153 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9"] Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.019450 4743 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T15:09:30.726149137Z","Handler":null,"Name":""} Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.027946 4743 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.027981 4743 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.044748 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-serving-cert\") pod \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.044944 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-config\") pod \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-client-ca\") pod \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045311 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc5x8\" (UniqueName: \"kubernetes.io/projected/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-kube-api-access-kc5x8\") pod \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\" (UID: \"a257bbc1-f866-4d43-9011-7ed9bd6d13e9\") " Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-client-ca\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lmhj\" (UniqueName: \"kubernetes.io/projected/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-kube-api-access-8lmhj\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045683 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-serving-cert\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045721 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-catalog-content\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9z62\" (UniqueName: \"kubernetes.io/projected/2b8d4a30-a71d-48ca-b702-772f0e08c566-kube-api-access-s9z62\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.045787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-config\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.046215 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-proxy-ca-bundles\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.046256 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-utilities\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.046324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "a257bbc1-f866-4d43-9011-7ed9bd6d13e9" (UID: "a257bbc1-f866-4d43-9011-7ed9bd6d13e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.046916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-utilities\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.047269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-catalog-content\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.049699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.055389 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-config" (OuterVolumeSpecName: "config") pod "a257bbc1-f866-4d43-9011-7ed9bd6d13e9" (UID: "a257bbc1-f866-4d43-9011-7ed9bd6d13e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.056077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-kube-api-access-kc5x8" (OuterVolumeSpecName: "kube-api-access-kc5x8") pod "a257bbc1-f866-4d43-9011-7ed9bd6d13e9" (UID: "a257bbc1-f866-4d43-9011-7ed9bd6d13e9"). InnerVolumeSpecName "kube-api-access-kc5x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.056357 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a257bbc1-f866-4d43-9011-7ed9bd6d13e9" (UID: "a257bbc1-f866-4d43-9011-7ed9bd6d13e9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.085276 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9z62\" (UniqueName: \"kubernetes.io/projected/2b8d4a30-a71d-48ca-b702-772f0e08c566-kube-api-access-s9z62\") pod \"certified-operators-mkgdp\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.118182 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:31 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:31 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:31 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.118262 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lmhj\" (UniqueName: \"kubernetes.io/projected/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-kube-api-access-8lmhj\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147172 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-serving-cert\") pod 
\"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-config\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-proxy-ca-bundles\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147339 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-client-ca\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147451 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147462 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc5x8\" (UniqueName: \"kubernetes.io/projected/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-kube-api-access-kc5x8\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147472 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.147481 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a257bbc1-f866-4d43-9011-7ed9bd6d13e9-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.151658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-proxy-ca-bundles\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.152632 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-config\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.153511 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.153583 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.154201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-client-ca\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.179992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-serving-cert\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.180152 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lmhj\" (UniqueName: \"kubernetes.io/projected/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-kube-api-access-8lmhj\") pod \"controller-manager-7cb6d674bb-g9sc9\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.225097 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.246759 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mxlth\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.302196 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qbmf"] Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.307791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" event={"ID":"3c0b5287-3555-4c8f-a6cc-7e689b3046e1","Type":"ContainerStarted","Data":"138c81a9e165eb1ffe3348469f917074b3a080f9f49b0bb65ec85def248c3dad"} Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.307873 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" event={"ID":"3c0b5287-3555-4c8f-a6cc-7e689b3046e1","Type":"ContainerStarted","Data":"8d4ec209d5d4ad3afd08f7d0b8f48bdaee1ba1d3fa944100e33faa2a20cd4c38"} Mar 10 15:09:31 crc kubenswrapper[4743]: W0310 15:09:31.314148 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fb16ec3_618c_4095_a3e7_3f59920d921b.slice/crio-e1effe78e4632c24d58fe7ba06196e0b73c6b659607c92b8058ec8f70763f4d4 WatchSource:0}: Error finding container e1effe78e4632c24d58fe7ba06196e0b73c6b659607c92b8058ec8f70763f4d4: Status 404 returned error can't find the container with id e1effe78e4632c24d58fe7ba06196e0b73c6b659607c92b8058ec8f70763f4d4 Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.321778 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="a257bbc1-f866-4d43-9011-7ed9bd6d13e9" containerID="28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd" exitCode=0 Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.321849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" event={"ID":"a257bbc1-f866-4d43-9011-7ed9bd6d13e9","Type":"ContainerDied","Data":"28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd"} Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.321875 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" event={"ID":"a257bbc1-f866-4d43-9011-7ed9bd6d13e9","Type":"ContainerDied","Data":"d9523735b51b58083c873df5af265bd1a0e8cd77e443c7477db82f7460b09e33"} Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.321893 4743 scope.go:117] "RemoveContainer" containerID="28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.322014 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.335331 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.335877 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6tgf"] Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.340781 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5ps5r" podStartSLOduration=11.340753287 podStartE2EDuration="11.340753287s" podCreationTimestamp="2026-03-10 15:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:31.330058405 +0000 UTC m=+236.036873153" watchObservedRunningTime="2026-03-10 15:09:31.340753287 +0000 UTC m=+236.047568035" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.372402 4743 generic.go:334] "Generic (PLEG): container finished" podID="e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89" containerID="e5ca210bdab9aeadfaaa4fdaccea09d81610e74728e650ea7841c4c954e71317" exitCode=0 Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.372599 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg"] Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.372636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" event={"ID":"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89","Type":"ContainerDied","Data":"e5ca210bdab9aeadfaaa4fdaccea09d81610e74728e650ea7841c4c954e71317"} Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.379616 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xczzg"] Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.381988 4743 ???:1] "http: TLS handshake error from 192.168.126.11:49324: no serving certificate 
available for the kubelet" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.403245 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.403574 4743 scope.go:117] "RemoveContainer" containerID="28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd" Mar 10 15:09:31 crc kubenswrapper[4743]: E0310 15:09:31.405718 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd\": container with ID starting with 28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd not found: ID does not exist" containerID="28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.405770 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd"} err="failed to get container status \"28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd\": rpc error: code = NotFound desc = could not find container \"28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd\": container with ID starting with 28c3895bcc8b31be9bf73ffd6699a061becef4c059426065b652c4b312cb24bd not found: ID does not exist" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.462698 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h4q2k"] Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.595422 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mkgdp"] Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.755162 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9"] Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.822825 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mxlth"] Mar 10 15:09:31 crc kubenswrapper[4743]: W0310 15:09:31.863079 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199e5a98_b472_45af_9088_ffe163ceba78.slice/crio-0450bca02e49efd41f9282c9cc97de58da1e5fc154f2db307dc27f87a468cb3e WatchSource:0}: Error finding container 0450bca02e49efd41f9282c9cc97de58da1e5fc154f2db307dc27f87a468cb3e: Status 404 returned error can't find the container with id 0450bca02e49efd41f9282c9cc97de58da1e5fc154f2db307dc27f87a468cb3e Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.930267 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4370e3f8-d9d3-48c8-a10f-d19c28342bb6" path="/var/lib/kubelet/pods/4370e3f8-d9d3-48c8-a10f-d19c28342bb6/volumes" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.931493 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.932207 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a257bbc1-f866-4d43-9011-7ed9bd6d13e9" path="/var/lib/kubelet/pods/a257bbc1-f866-4d43-9011-7ed9bd6d13e9/volumes" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.962115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:31 crc kubenswrapper[4743]: I0310 15:09:31.970341 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acbc8434-7aab-481b-ae0e-08696da082ad-metrics-certs\") pod \"network-metrics-daemon-vcq2w\" (UID: \"acbc8434-7aab-481b-ae0e-08696da082ad\") " pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.036330 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcq2w" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.114871 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:32 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:32 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:32 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.114976 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.255463 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rxzjq"] Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.257017 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.259259 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.269682 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxzjq"] Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.294092 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.294161 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.315936 4743 patch_prober.go:28] interesting pod/console-f9d7485db-v9bc6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.316001 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v9bc6" podUID="bc7402d9-c20f-4429-bda9-db2b1ccddf8e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.352498 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vcq2w"] Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.371314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-utilities\") pod \"redhat-marketplace-rxzjq\" (UID: 
\"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.371393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hnpv\" (UniqueName: \"kubernetes.io/projected/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-kube-api-access-8hnpv\") pod \"redhat-marketplace-rxzjq\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.371495 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-catalog-content\") pod \"redhat-marketplace-rxzjq\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.416362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" event={"ID":"acbc8434-7aab-481b-ae0e-08696da082ad","Type":"ContainerStarted","Data":"1bfb247af32c2d43aca55f082b80ab450e09d7b3d50d577199ba1eec7ea36975"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.418058 4743 generic.go:334] "Generic (PLEG): container finished" podID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerID="7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1" exitCode=0 Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.418632 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qbmf" event={"ID":"5fb16ec3-618c-4095-a3e7-3f59920d921b","Type":"ContainerDied","Data":"7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.418674 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-8qbmf" event={"ID":"5fb16ec3-618c-4095-a3e7-3f59920d921b","Type":"ContainerStarted","Data":"e1effe78e4632c24d58fe7ba06196e0b73c6b659607c92b8058ec8f70763f4d4"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.433220 4743 generic.go:334] "Generic (PLEG): container finished" podID="d815b880-6675-42e2-8380-3e1aaae065a7" containerID="56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec" exitCode=0 Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.434012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6tgf" event={"ID":"d815b880-6675-42e2-8380-3e1aaae065a7","Type":"ContainerDied","Data":"56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.434118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6tgf" event={"ID":"d815b880-6675-42e2-8380-3e1aaae065a7","Type":"ContainerStarted","Data":"da0735b1e6092952ace9c7df1ba541c9a21b7b47de758285b53ff418e221ccb2"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.438423 4743 generic.go:334] "Generic (PLEG): container finished" podID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerID="77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480" exitCode=0 Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.438693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkgdp" event={"ID":"2b8d4a30-a71d-48ca-b702-772f0e08c566","Type":"ContainerDied","Data":"77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.438732 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkgdp" 
event={"ID":"2b8d4a30-a71d-48ca-b702-772f0e08c566","Type":"ContainerStarted","Data":"bdfd5d5bc6275aeb664301305fe9c2c9028696c83bcc232ec5e428a200a13a82"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.446075 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec21919b-a512-42f8-b1ce-80498821cb65" containerID="b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e" exitCode=0 Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.446256 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4q2k" event={"ID":"ec21919b-a512-42f8-b1ce-80498821cb65","Type":"ContainerDied","Data":"b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.446346 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4q2k" event={"ID":"ec21919b-a512-42f8-b1ce-80498821cb65","Type":"ContainerStarted","Data":"5a64fccb54b52cc75f6bc55d154d7dcc61cd38cf86ec49a9b953c4d9f21e89b5"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.449867 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.449921 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.480774 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-utilities\") pod \"redhat-marketplace-rxzjq\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.480837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hnpv\" 
(UniqueName: \"kubernetes.io/projected/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-kube-api-access-8hnpv\") pod \"redhat-marketplace-rxzjq\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.480906 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-catalog-content\") pod \"redhat-marketplace-rxzjq\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.484114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-utilities\") pod \"redhat-marketplace-rxzjq\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.485118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-catalog-content\") pod \"redhat-marketplace-rxzjq\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.498239 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" event={"ID":"199e5a98-b472-45af-9088-ffe163ceba78","Type":"ContainerStarted","Data":"be0610a1dd3a80a5a4f87713d35cd0e21d06edc1979c61feb7fa273808d85f6c"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.498288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" 
event={"ID":"199e5a98-b472-45af-9088-ffe163ceba78","Type":"ContainerStarted","Data":"0450bca02e49efd41f9282c9cc97de58da1e5fc154f2db307dc27f87a468cb3e"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.499332 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.502216 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.519996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" event={"ID":"8cb28af8-7ac7-456b-9439-7fcd96e81f3c","Type":"ContainerStarted","Data":"22f76f686ec753fab2379b26adf8e048b71c2126b5d2a8b44f99ad93995a1eac"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.520081 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" event={"ID":"8cb28af8-7ac7-456b-9439-7fcd96e81f3c","Type":"ContainerStarted","Data":"a729e82fea71cdd1a25222d062fff13d235030ea5420e76d3e6369e0a53904d9"} Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.521689 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.533288 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.575261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hnpv\" (UniqueName: \"kubernetes.io/projected/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-kube-api-access-8hnpv\") pod \"redhat-marketplace-rxzjq\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " 
pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.577241 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.645595 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" podStartSLOduration=189.64556875 podStartE2EDuration="3m9.64556875s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:32.643128839 +0000 UTC m=+237.349943587" watchObservedRunningTime="2026-03-10 15:09:32.64556875 +0000 UTC m=+237.352383498" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.656637 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sz5nj"] Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.657948 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.678332 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" podStartSLOduration=3.678302044 podStartE2EDuration="3.678302044s" podCreationTimestamp="2026-03-10 15:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:32.671442644 +0000 UTC m=+237.378257392" watchObservedRunningTime="2026-03-10 15:09:32.678302044 +0000 UTC m=+237.385116792" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.687287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcmff\" (UniqueName: \"kubernetes.io/projected/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-kube-api-access-pcmff\") pod \"redhat-marketplace-sz5nj\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.687980 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-utilities\") pod \"redhat-marketplace-sz5nj\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.688080 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sz5nj"] Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.688113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-catalog-content\") pod \"redhat-marketplace-sz5nj\" (UID: 
\"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.703934 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-59rpz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.703967 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-59rpz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" start-of-body= Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.704012 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-59rpz" podUID="ec0a0850-2f3c-4a27-a08c-0820a360ace9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.704045 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-59rpz" podUID="ec0a0850-2f3c-4a27-a08c-0820a360ace9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.45:8080/\": dial tcp 10.217.0.45:8080: connect: connection refused" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.758385 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.758431 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.775098 4743 patch_prober.go:28] interesting 
pod/apiserver-76f77b778f-658rk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]log ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]etcd ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/max-in-flight-filter ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 10 15:09:32 crc kubenswrapper[4743]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/project.openshift.io-projectcache ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-startinformers ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 10 15:09:32 crc kubenswrapper[4743]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 15:09:32 crc kubenswrapper[4743]: livez check failed Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.775174 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-658rk" podUID="320568c9-bd4b-4ebd-a575-650bcdd5d104" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.790444 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-utilities\") pod \"redhat-marketplace-sz5nj\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.790608 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-catalog-content\") pod \"redhat-marketplace-sz5nj\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.790632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcmff\" (UniqueName: \"kubernetes.io/projected/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-kube-api-access-pcmff\") pod \"redhat-marketplace-sz5nj\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.792047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-catalog-content\") pod \"redhat-marketplace-sz5nj\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.793119 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-utilities\") pod \"redhat-marketplace-sz5nj\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.825999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcmff\" (UniqueName: 
\"kubernetes.io/projected/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-kube-api-access-pcmff\") pod \"redhat-marketplace-sz5nj\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.927664 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.970530 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw"] Mar 10 15:09:32 crc kubenswrapper[4743]: E0310 15:09:32.971460 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89" containerName="collect-profiles" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.971485 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89" containerName="collect-profiles" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.971656 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89" containerName="collect-profiles" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.972398 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.978648 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.980129 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.980395 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.980711 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.980836 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.981861 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.987750 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw"] Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.990107 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.993084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-secret-volume\") pod \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.993219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qd6\" (UniqueName: \"kubernetes.io/projected/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-kube-api-access-l7qd6\") pod \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.993310 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume\") pod \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\" (UID: \"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89\") " Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.994964 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume" (OuterVolumeSpecName: "config-volume") pod "e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89" (UID: "e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:32 crc kubenswrapper[4743]: I0310 15:09:32.998859 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89" (UID: "e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.001558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-kube-api-access-l7qd6" (OuterVolumeSpecName: "kube-api-access-l7qd6") pod "e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89" (UID: "e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89"). InnerVolumeSpecName "kube-api-access-l7qd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.004100 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxzjq"] Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.030261 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.031283 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.034434 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.034777 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.069196 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094403 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37714ff0-26cf-4540-9065-82c1dd15885a-serving-cert\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") 
" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-client-ca\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkn6z\" (UniqueName: \"kubernetes.io/projected/37714ff0-26cf-4540-9065-82c1dd15885a-kube-api-access-mkn6z\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094548 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b5915d5-7a78-4b33-84c8-f7941c659704-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b5915d5-7a78-4b33-84c8-f7941c659704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094571 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b5915d5-7a78-4b33-84c8-f7941c659704-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b5915d5-7a78-4b33-84c8-f7941c659704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094609 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-config\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094685 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094698 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.094707 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qd6\" (UniqueName: \"kubernetes.io/projected/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89-kube-api-access-l7qd6\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.111236 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.115823 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:33 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:33 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:33 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.115884 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" 
podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.196209 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37714ff0-26cf-4540-9065-82c1dd15885a-serving-cert\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.196320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-client-ca\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.196414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkn6z\" (UniqueName: \"kubernetes.io/projected/37714ff0-26cf-4540-9065-82c1dd15885a-kube-api-access-mkn6z\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.196563 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b5915d5-7a78-4b33-84c8-f7941c659704-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b5915d5-7a78-4b33-84c8-f7941c659704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.196613 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b5915d5-7a78-4b33-84c8-f7941c659704-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b5915d5-7a78-4b33-84c8-f7941c659704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.196667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-config\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.197545 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b5915d5-7a78-4b33-84c8-f7941c659704-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b5915d5-7a78-4b33-84c8-f7941c659704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.209361 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37714ff0-26cf-4540-9065-82c1dd15885a-serving-cert\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.209446 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-config\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.211535 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-client-ca\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.215746 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b5915d5-7a78-4b33-84c8-f7941c659704-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b5915d5-7a78-4b33-84c8-f7941c659704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.224436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkn6z\" (UniqueName: \"kubernetes.io/projected/37714ff0-26cf-4540-9065-82c1dd15885a-kube-api-access-mkn6z\") pod \"route-controller-manager-6b977996dd-lbvvw\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.304691 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.389757 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.510212 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.551762 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sz5nj"] Mar 10 15:09:33 crc kubenswrapper[4743]: W0310 15:09:33.593475 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8830b2a3_e8f3_48d0_85ca_5c936ca0a04c.slice/crio-1357b05974f924dd542acf59bcdefb3de38e9a1e72760b7c184cb9df6fcd0ded WatchSource:0}: Error finding container 1357b05974f924dd542acf59bcdefb3de38e9a1e72760b7c184cb9df6fcd0ded: Status 404 returned error can't find the container with id 1357b05974f924dd542acf59bcdefb3de38e9a1e72760b7c184cb9df6fcd0ded Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.611433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" event={"ID":"e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89","Type":"ContainerDied","Data":"ac8cf216071eb114f067243d0f5bd94eead414349c06d11190bcdecf9fc94245"} Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.611604 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac8cf216071eb114f067243d0f5bd94eead414349c06d11190bcdecf9fc94245" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.611717 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.621031 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" event={"ID":"acbc8434-7aab-481b-ae0e-08696da082ad","Type":"ContainerStarted","Data":"72a56e15ff1134fb12a3bb63cab3db6ec3cff3120377638ec85ffadee979744a"} Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.621090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vcq2w" event={"ID":"acbc8434-7aab-481b-ae0e-08696da082ad","Type":"ContainerStarted","Data":"6c9767e007d6144654d32dcbfe09987832e67e9a37ee61db2aee03f3fee5d0a8"} Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.626987 4743 generic.go:334] "Generic (PLEG): container finished" podID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerID="2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad" exitCode=0 Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.627051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxzjq" event={"ID":"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c","Type":"ContainerDied","Data":"2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad"} Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.627287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxzjq" event={"ID":"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c","Type":"ContainerStarted","Data":"4d83bcaa8e40a02245c4ee0eb2fad34f8c74f1805788c3a5b470868a44c1a72d"} Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.638024 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hdwfr" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.678323 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-vcq2w" podStartSLOduration=190.678296381 podStartE2EDuration="3m10.678296381s" podCreationTimestamp="2026-03-10 15:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:33.655089114 +0000 UTC m=+238.361903862" watchObservedRunningTime="2026-03-10 15:09:33.678296381 +0000 UTC m=+238.385111129" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.688801 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2hq6n"] Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.707068 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hq6n"] Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.708965 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.715247 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.808287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-utilities\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.808352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-catalog-content\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc 
kubenswrapper[4743]: I0310 15:09:33.808370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d58x2\" (UniqueName: \"kubernetes.io/projected/0eb81c77-0afe-417e-a904-90a76e45f309-kube-api-access-d58x2\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.864663 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw"] Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.909663 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-catalog-content\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.909706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d58x2\" (UniqueName: \"kubernetes.io/projected/0eb81c77-0afe-417e-a904-90a76e45f309-kube-api-access-d58x2\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.909733 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-utilities\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.910420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-catalog-content\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.911040 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-utilities\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:33 crc kubenswrapper[4743]: I0310 15:09:33.957458 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d58x2\" (UniqueName: \"kubernetes.io/projected/0eb81c77-0afe-417e-a904-90a76e45f309-kube-api-access-d58x2\") pod \"redhat-operators-2hq6n\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.051757 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.057882 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cv2mc"] Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.061359 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.089287 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cv2mc"] Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.106132 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.131001 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:34 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:34 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:34 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.131554 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.231730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzxhn\" (UniqueName: \"kubernetes.io/projected/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-kube-api-access-mzxhn\") pod \"redhat-operators-cv2mc\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.231842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-utilities\") pod \"redhat-operators-cv2mc\" (UID: 
\"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.231869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-catalog-content\") pod \"redhat-operators-cv2mc\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.279781 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.280862 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.283832 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.295540 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.306002 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.340338 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzxhn\" (UniqueName: \"kubernetes.io/projected/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-kube-api-access-mzxhn\") pod \"redhat-operators-cv2mc\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.340512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-utilities\") pod \"redhat-operators-cv2mc\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.340592 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-catalog-content\") pod \"redhat-operators-cv2mc\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.341334 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-catalog-content\") pod \"redhat-operators-cv2mc\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.343610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-utilities\") pod \"redhat-operators-cv2mc\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.379348 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzxhn\" (UniqueName: \"kubernetes.io/projected/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-kube-api-access-mzxhn\") pod \"redhat-operators-cv2mc\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.421768 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.441718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.442188 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.526045 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hq6n"] Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.544186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.544300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.544432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.569041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.657573 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.733286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b5915d5-7a78-4b33-84c8-f7941c659704","Type":"ContainerStarted","Data":"a19578d913066c597ca739c01386d50c67e09d7420c6cd670ae70241840a98dd"} Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.754398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" event={"ID":"37714ff0-26cf-4540-9065-82c1dd15885a","Type":"ContainerStarted","Data":"2af29dd19cd2a016c36592195126d15bfe6fbce40a6df22cbdf75b46987b92e8"} Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.754441 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" event={"ID":"37714ff0-26cf-4540-9065-82c1dd15885a","Type":"ContainerStarted","Data":"7aee4920c262237ecbb92b6674319f4023fdbca5615d044538000fdee7c6279c"} Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.754945 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.775936 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" podStartSLOduration=5.775898782 podStartE2EDuration="5.775898782s" podCreationTimestamp="2026-03-10 15:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:34.775774588 +0000 UTC m=+239.482589356" watchObservedRunningTime="2026-03-10 15:09:34.775898782 +0000 UTC m=+239.482713550" Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.781483 4743 generic.go:334] "Generic (PLEG): container finished" podID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerID="1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063" exitCode=0 Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.781694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sz5nj" event={"ID":"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c","Type":"ContainerDied","Data":"1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063"} Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.781765 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sz5nj" event={"ID":"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c","Type":"ContainerStarted","Data":"1357b05974f924dd542acf59bcdefb3de38e9a1e72760b7c184cb9df6fcd0ded"} Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.789071 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cv2mc"] Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.798148 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hq6n" 
event={"ID":"0eb81c77-0afe-417e-a904-90a76e45f309","Type":"ContainerStarted","Data":"48fc5507d84a7ed5f994d2c8cc0aedd3133c0c9d9a12213984187744328afb84"} Mar 10 15:09:34 crc kubenswrapper[4743]: I0310 15:09:34.989688 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:09:35 crc kubenswrapper[4743]: W0310 15:09:35.071087 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf4933d7e_51c1_4a44_a774_d2ee26dd7c88.slice/crio-b14a55e1a75ba64f2b3a73605f7c4865d6f62e278f52f8b5a3a327d5a327c68e WatchSource:0}: Error finding container b14a55e1a75ba64f2b3a73605f7c4865d6f62e278f52f8b5a3a327d5a327c68e: Status 404 returned error can't find the container with id b14a55e1a75ba64f2b3a73605f7c4865d6f62e278f52f8b5a3a327d5a327c68e Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.115132 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:35 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:35 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:35 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.115187 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.198188 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.608051 4743 ???:1] "http: TLS handshake error from 
192.168.126.11:49336: no serving certificate available for the kubelet" Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.806996 4743 generic.go:334] "Generic (PLEG): container finished" podID="0eb81c77-0afe-417e-a904-90a76e45f309" containerID="39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9" exitCode=0 Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.807091 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hq6n" event={"ID":"0eb81c77-0afe-417e-a904-90a76e45f309","Type":"ContainerDied","Data":"39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9"} Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.812269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f4933d7e-51c1-4a44-a774-d2ee26dd7c88","Type":"ContainerStarted","Data":"b14a55e1a75ba64f2b3a73605f7c4865d6f62e278f52f8b5a3a327d5a327c68e"} Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.818964 4743 generic.go:334] "Generic (PLEG): container finished" podID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerID="409b90acd45605582c37c1e315cffbf440b459ef5a59c628e3f1c62dffd5ebc7" exitCode=0 Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.819111 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv2mc" event={"ID":"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493","Type":"ContainerDied","Data":"409b90acd45605582c37c1e315cffbf440b459ef5a59c628e3f1c62dffd5ebc7"} Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.819198 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv2mc" event={"ID":"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493","Type":"ContainerStarted","Data":"763a304f28c802381bad3b010491352be3204447a7871ea4905bf223c4c2d83c"} Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.822446 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="0b5915d5-7a78-4b33-84c8-f7941c659704" containerID="d9fc02e566d9c7f382eb4b0de807f20be2f4a163b4f8ed58e9ff349828981d3b" exitCode=0 Mar 10 15:09:35 crc kubenswrapper[4743]: I0310 15:09:35.822791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b5915d5-7a78-4b33-84c8-f7941c659704","Type":"ContainerDied","Data":"d9fc02e566d9c7f382eb4b0de807f20be2f4a163b4f8ed58e9ff349828981d3b"} Mar 10 15:09:36 crc kubenswrapper[4743]: I0310 15:09:36.118019 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:36 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:36 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:36 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:36 crc kubenswrapper[4743]: I0310 15:09:36.118665 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:36 crc kubenswrapper[4743]: I0310 15:09:36.554517 4743 ???:1] "http: TLS handshake error from 192.168.126.11:49352: no serving certificate available for the kubelet" Mar 10 15:09:36 crc kubenswrapper[4743]: I0310 15:09:36.858985 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4933d7e-51c1-4a44-a774-d2ee26dd7c88" containerID="c6ca58da72b0c4f47d50825bd49caae7d3689435dde9d0e6dea27e8704ba7769" exitCode=0 Mar 10 15:09:36 crc kubenswrapper[4743]: I0310 15:09:36.859091 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"f4933d7e-51c1-4a44-a774-d2ee26dd7c88","Type":"ContainerDied","Data":"c6ca58da72b0c4f47d50825bd49caae7d3689435dde9d0e6dea27e8704ba7769"} Mar 10 15:09:37 crc kubenswrapper[4743]: I0310 15:09:37.114453 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:37 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:37 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:37 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:37 crc kubenswrapper[4743]: I0310 15:09:37.114965 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:37 crc kubenswrapper[4743]: I0310 15:09:37.764190 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:37 crc kubenswrapper[4743]: I0310 15:09:37.789069 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-658rk" Mar 10 15:09:38 crc kubenswrapper[4743]: I0310 15:09:38.114269 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:38 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:38 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:38 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:38 crc kubenswrapper[4743]: I0310 15:09:38.114344 4743 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:38 crc kubenswrapper[4743]: I0310 15:09:38.329997 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wxcl8" Mar 10 15:09:39 crc kubenswrapper[4743]: I0310 15:09:39.113484 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:39 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:39 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:39 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:39 crc kubenswrapper[4743]: I0310 15:09:39.114125 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:40 crc kubenswrapper[4743]: I0310 15:09:40.119530 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:40 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:40 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:40 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:40 crc kubenswrapper[4743]: I0310 15:09:40.119619 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 10 15:09:41 crc kubenswrapper[4743]: I0310 15:09:41.117469 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:41 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:41 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:41 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:41 crc kubenswrapper[4743]: I0310 15:09:41.118015 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:41 crc kubenswrapper[4743]: I0310 15:09:41.259109 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:09:41 crc kubenswrapper[4743]: I0310 15:09:41.259188 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:09:42 crc kubenswrapper[4743]: I0310 15:09:42.113202 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:42 crc kubenswrapper[4743]: [-]has-synced failed: reason 
withheld Mar 10 15:09:42 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:42 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:42 crc kubenswrapper[4743]: I0310 15:09:42.113281 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:42 crc kubenswrapper[4743]: I0310 15:09:42.293131 4743 patch_prober.go:28] interesting pod/console-f9d7485db-v9bc6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 10 15:09:42 crc kubenswrapper[4743]: I0310 15:09:42.293200 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v9bc6" podUID="bc7402d9-c20f-4429-bda9-db2b1ccddf8e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 10 15:09:42 crc kubenswrapper[4743]: I0310 15:09:42.710701 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-59rpz" Mar 10 15:09:43 crc kubenswrapper[4743]: I0310 15:09:43.115093 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:43 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:43 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:43 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:43 crc kubenswrapper[4743]: I0310 15:09:43.115193 4743 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:44 crc kubenswrapper[4743]: I0310 15:09:44.113911 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:44 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Mar 10 15:09:44 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:44 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:44 crc kubenswrapper[4743]: I0310 15:09:44.114011 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.112418 4743 patch_prober.go:28] interesting pod/router-default-5444994796-nsqd8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:45 crc kubenswrapper[4743]: [+]has-synced ok Mar 10 15:09:45 crc kubenswrapper[4743]: [+]process-running ok Mar 10 15:09:45 crc kubenswrapper[4743]: healthz check failed Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.112514 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nsqd8" podUID="1da9c1ba-ea6b-41e2-9e9b-8c7876ca7c48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.856205 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.866029 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.974958 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.974951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f4933d7e-51c1-4a44-a774-d2ee26dd7c88","Type":"ContainerDied","Data":"b14a55e1a75ba64f2b3a73605f7c4865d6f62e278f52f8b5a3a327d5a327c68e"} Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.975098 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b14a55e1a75ba64f2b3a73605f7c4865d6f62e278f52f8b5a3a327d5a327c68e" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.976642 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b5915d5-7a78-4b33-84c8-f7941c659704","Type":"ContainerDied","Data":"a19578d913066c597ca739c01386d50c67e09d7420c6cd670ae70241840a98dd"} Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.976683 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a19578d913066c597ca739c01386d50c67e09d7420c6cd670ae70241840a98dd" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.976728 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.979225 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kube-api-access\") pod \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\" (UID: \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\") " Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.979276 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b5915d5-7a78-4b33-84c8-f7941c659704-kubelet-dir\") pod \"0b5915d5-7a78-4b33-84c8-f7941c659704\" (UID: \"0b5915d5-7a78-4b33-84c8-f7941c659704\") " Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.979302 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kubelet-dir\") pod \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\" (UID: \"f4933d7e-51c1-4a44-a774-d2ee26dd7c88\") " Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.979401 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b5915d5-7a78-4b33-84c8-f7941c659704-kube-api-access\") pod \"0b5915d5-7a78-4b33-84c8-f7941c659704\" (UID: \"0b5915d5-7a78-4b33-84c8-f7941c659704\") " Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.979409 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b5915d5-7a78-4b33-84c8-f7941c659704-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b5915d5-7a78-4b33-84c8-f7941c659704" (UID: "0b5915d5-7a78-4b33-84c8-f7941c659704"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.979470 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f4933d7e-51c1-4a44-a774-d2ee26dd7c88" (UID: "f4933d7e-51c1-4a44-a774-d2ee26dd7c88"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.979772 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b5915d5-7a78-4b33-84c8-f7941c659704-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.979798 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.986911 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5915d5-7a78-4b33-84c8-f7941c659704-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b5915d5-7a78-4b33-84c8-f7941c659704" (UID: "0b5915d5-7a78-4b33-84c8-f7941c659704"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:09:45 crc kubenswrapper[4743]: I0310 15:09:45.987345 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f4933d7e-51c1-4a44-a774-d2ee26dd7c88" (UID: "f4933d7e-51c1-4a44-a774-d2ee26dd7c88"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:09:46 crc kubenswrapper[4743]: I0310 15:09:46.081678 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b5915d5-7a78-4b33-84c8-f7941c659704-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:46 crc kubenswrapper[4743]: I0310 15:09:46.081715 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4933d7e-51c1-4a44-a774-d2ee26dd7c88-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:46 crc kubenswrapper[4743]: I0310 15:09:46.113162 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:46 crc kubenswrapper[4743]: I0310 15:09:46.116280 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nsqd8" Mar 10 15:09:48 crc kubenswrapper[4743]: I0310 15:09:48.604297 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9"] Mar 10 15:09:48 crc kubenswrapper[4743]: I0310 15:09:48.605152 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" podUID="8cb28af8-7ac7-456b-9439-7fcd96e81f3c" containerName="controller-manager" containerID="cri-o://22f76f686ec753fab2379b26adf8e048b71c2126b5d2a8b44f99ad93995a1eac" gracePeriod=30 Mar 10 15:09:48 crc kubenswrapper[4743]: I0310 15:09:48.617498 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw"] Mar 10 15:09:48 crc kubenswrapper[4743]: I0310 15:09:48.617884 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" 
podUID="37714ff0-26cf-4540-9065-82c1dd15885a" containerName="route-controller-manager" containerID="cri-o://2af29dd19cd2a016c36592195126d15bfe6fbce40a6df22cbdf75b46987b92e8" gracePeriod=30 Mar 10 15:09:51 crc kubenswrapper[4743]: I0310 15:09:51.009710 4743 generic.go:334] "Generic (PLEG): container finished" podID="8cb28af8-7ac7-456b-9439-7fcd96e81f3c" containerID="22f76f686ec753fab2379b26adf8e048b71c2126b5d2a8b44f99ad93995a1eac" exitCode=0 Mar 10 15:09:51 crc kubenswrapper[4743]: I0310 15:09:51.009808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" event={"ID":"8cb28af8-7ac7-456b-9439-7fcd96e81f3c","Type":"ContainerDied","Data":"22f76f686ec753fab2379b26adf8e048b71c2126b5d2a8b44f99ad93995a1eac"} Mar 10 15:09:51 crc kubenswrapper[4743]: I0310 15:09:51.012959 4743 generic.go:334] "Generic (PLEG): container finished" podID="37714ff0-26cf-4540-9065-82c1dd15885a" containerID="2af29dd19cd2a016c36592195126d15bfe6fbce40a6df22cbdf75b46987b92e8" exitCode=0 Mar 10 15:09:51 crc kubenswrapper[4743]: I0310 15:09:51.012990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" event={"ID":"37714ff0-26cf-4540-9065-82c1dd15885a","Type":"ContainerDied","Data":"2af29dd19cd2a016c36592195126d15bfe6fbce40a6df22cbdf75b46987b92e8"} Mar 10 15:09:51 crc kubenswrapper[4743]: I0310 15:09:51.409219 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.155112 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.193197 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74648876b-6z5s9"] Mar 10 15:09:52 crc kubenswrapper[4743]: E0310 15:09:52.193499 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb28af8-7ac7-456b-9439-7fcd96e81f3c" containerName="controller-manager" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.193513 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb28af8-7ac7-456b-9439-7fcd96e81f3c" containerName="controller-manager" Mar 10 15:09:52 crc kubenswrapper[4743]: E0310 15:09:52.193537 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5915d5-7a78-4b33-84c8-f7941c659704" containerName="pruner" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.193545 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5915d5-7a78-4b33-84c8-f7941c659704" containerName="pruner" Mar 10 15:09:52 crc kubenswrapper[4743]: E0310 15:09:52.193560 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4933d7e-51c1-4a44-a774-d2ee26dd7c88" containerName="pruner" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.193568 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4933d7e-51c1-4a44-a774-d2ee26dd7c88" containerName="pruner" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.193677 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4933d7e-51c1-4a44-a774-d2ee26dd7c88" containerName="pruner" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.193694 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb28af8-7ac7-456b-9439-7fcd96e81f3c" containerName="controller-manager" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.193704 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0b5915d5-7a78-4b33-84c8-f7941c659704" containerName="pruner" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.194234 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.202350 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74648876b-6z5s9"] Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.221899 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-serving-cert\") pod \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.221989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-config\") pod \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.222059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lmhj\" (UniqueName: \"kubernetes.io/projected/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-kube-api-access-8lmhj\") pod \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.222090 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-proxy-ca-bundles\") pod \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.222117 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-client-ca\") pod \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\" (UID: \"8cb28af8-7ac7-456b-9439-7fcd96e81f3c\") " Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.222598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-proxy-ca-bundles\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.222630 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c56fd26-496a-4d08-8e5f-32cec30e295e-serving-cert\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.222657 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fp7j\" (UniqueName: \"kubernetes.io/projected/0c56fd26-496a-4d08-8e5f-32cec30e295e-kube-api-access-6fp7j\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.222693 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-config\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " 
pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.222741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-client-ca\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.224425 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-config" (OuterVolumeSpecName: "config") pod "8cb28af8-7ac7-456b-9439-7fcd96e81f3c" (UID: "8cb28af8-7ac7-456b-9439-7fcd96e81f3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.226728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8cb28af8-7ac7-456b-9439-7fcd96e81f3c" (UID: "8cb28af8-7ac7-456b-9439-7fcd96e81f3c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.227007 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-client-ca" (OuterVolumeSpecName: "client-ca") pod "8cb28af8-7ac7-456b-9439-7fcd96e81f3c" (UID: "8cb28af8-7ac7-456b-9439-7fcd96e81f3c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.231074 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cb28af8-7ac7-456b-9439-7fcd96e81f3c" (UID: "8cb28af8-7ac7-456b-9439-7fcd96e81f3c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.231426 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-kube-api-access-8lmhj" (OuterVolumeSpecName: "kube-api-access-8lmhj") pod "8cb28af8-7ac7-456b-9439-7fcd96e81f3c" (UID: "8cb28af8-7ac7-456b-9439-7fcd96e81f3c"). InnerVolumeSpecName "kube-api-access-8lmhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.296766 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.300904 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.323800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-proxy-ca-bundles\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.323867 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c56fd26-496a-4d08-8e5f-32cec30e295e-serving-cert\") pod 
\"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.323898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fp7j\" (UniqueName: \"kubernetes.io/projected/0c56fd26-496a-4d08-8e5f-32cec30e295e-kube-api-access-6fp7j\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.323937 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-config\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.323990 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-client-ca\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.324188 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lmhj\" (UniqueName: \"kubernetes.io/projected/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-kube-api-access-8lmhj\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.324205 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:52 crc 
kubenswrapper[4743]: I0310 15:09:52.324218 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.324231 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.324244 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb28af8-7ac7-456b-9439-7fcd96e81f3c-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.326301 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-client-ca\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.327622 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-proxy-ca-bundles\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.327785 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-config\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc 
kubenswrapper[4743]: I0310 15:09:52.345240 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c56fd26-496a-4d08-8e5f-32cec30e295e-serving-cert\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.345398 4743 patch_prober.go:28] interesting pod/controller-manager-7cb6d674bb-g9sc9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.345464 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" podUID="8cb28af8-7ac7-456b-9439-7fcd96e81f3c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.354477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fp7j\" (UniqueName: \"kubernetes.io/projected/0c56fd26-496a-4d08-8e5f-32cec30e295e-kube-api-access-6fp7j\") pod \"controller-manager-74648876b-6z5s9\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:52 crc kubenswrapper[4743]: I0310 15:09:52.564138 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.029447 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.029439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9" event={"ID":"8cb28af8-7ac7-456b-9439-7fcd96e81f3c","Type":"ContainerDied","Data":"a729e82fea71cdd1a25222d062fff13d235030ea5420e76d3e6369e0a53904d9"} Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.029627 4743 scope.go:117] "RemoveContainer" containerID="22f76f686ec753fab2379b26adf8e048b71c2126b5d2a8b44f99ad93995a1eac" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.066477 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9"] Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.070045 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cb6d674bb-g9sc9"] Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.442185 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:53 crc kubenswrapper[4743]: E0310 15:09:53.542990 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 15:09:53 crc kubenswrapper[4743]: E0310 15:09:53.543598 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:09:53 crc kubenswrapper[4743]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 15:09:53 crc kubenswrapper[4743]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lqfxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552588-mtmtv_openshift-infra(7837cec9-3686-497f-b9ec-2525768cd8ce): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 15:09:53 crc kubenswrapper[4743]: > logger="UnhandledError" Mar 10 15:09:53 crc kubenswrapper[4743]: E0310 15:09:53.545331 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" podUID="7837cec9-3686-497f-b9ec-2525768cd8ce" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.545704 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-config\") pod \"37714ff0-26cf-4540-9065-82c1dd15885a\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.545858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkn6z\" (UniqueName: \"kubernetes.io/projected/37714ff0-26cf-4540-9065-82c1dd15885a-kube-api-access-mkn6z\") pod \"37714ff0-26cf-4540-9065-82c1dd15885a\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.546734 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-config" (OuterVolumeSpecName: "config") pod "37714ff0-26cf-4540-9065-82c1dd15885a" (UID: "37714ff0-26cf-4540-9065-82c1dd15885a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.546824 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-client-ca\") pod \"37714ff0-26cf-4540-9065-82c1dd15885a\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.546900 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37714ff0-26cf-4540-9065-82c1dd15885a-serving-cert\") pod \"37714ff0-26cf-4540-9065-82c1dd15885a\" (UID: \"37714ff0-26cf-4540-9065-82c1dd15885a\") " Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.547491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-client-ca" (OuterVolumeSpecName: "client-ca") pod "37714ff0-26cf-4540-9065-82c1dd15885a" (UID: "37714ff0-26cf-4540-9065-82c1dd15885a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.548670 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.548693 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37714ff0-26cf-4540-9065-82c1dd15885a-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.552438 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37714ff0-26cf-4540-9065-82c1dd15885a-kube-api-access-mkn6z" (OuterVolumeSpecName: "kube-api-access-mkn6z") pod "37714ff0-26cf-4540-9065-82c1dd15885a" (UID: "37714ff0-26cf-4540-9065-82c1dd15885a"). InnerVolumeSpecName "kube-api-access-mkn6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.552563 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37714ff0-26cf-4540-9065-82c1dd15885a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "37714ff0-26cf-4540-9065-82c1dd15885a" (UID: "37714ff0-26cf-4540-9065-82c1dd15885a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.650565 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37714ff0-26cf-4540-9065-82c1dd15885a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.650621 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkn6z\" (UniqueName: \"kubernetes.io/projected/37714ff0-26cf-4540-9065-82c1dd15885a-kube-api-access-mkn6z\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:53 crc kubenswrapper[4743]: I0310 15:09:53.924479 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb28af8-7ac7-456b-9439-7fcd96e81f3c" path="/var/lib/kubelet/pods/8cb28af8-7ac7-456b-9439-7fcd96e81f3c/volumes" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.038399 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" event={"ID":"37714ff0-26cf-4540-9065-82c1dd15885a","Type":"ContainerDied","Data":"7aee4920c262237ecbb92b6674319f4023fdbca5615d044538000fdee7c6279c"} Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.038454 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" Mar 10 15:09:54 crc kubenswrapper[4743]: E0310 15:09:54.041196 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" podUID="7837cec9-3686-497f-b9ec-2525768cd8ce" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.126704 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw"] Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.138785 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw"] Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.305707 4743 patch_prober.go:28] interesting pod/route-controller-manager-6b977996dd-lbvvw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.305786 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b977996dd-lbvvw" podUID="37714ff0-26cf-4540-9065-82c1dd15885a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.985408 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq"] Mar 10 15:09:54 crc 
kubenswrapper[4743]: E0310 15:09:54.985643 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37714ff0-26cf-4540-9065-82c1dd15885a" containerName="route-controller-manager" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.985658 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="37714ff0-26cf-4540-9065-82c1dd15885a" containerName="route-controller-manager" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.985773 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="37714ff0-26cf-4540-9065-82c1dd15885a" containerName="route-controller-manager" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.986685 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.992372 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.992467 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.992608 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.992663 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.992727 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:09:54 crc kubenswrapper[4743]: I0310 15:09:54.992781 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:09:54 crc 
kubenswrapper[4743]: I0310 15:09:54.998687 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq"] Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.077799 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-config\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.179593 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gwb\" (UniqueName: \"kubernetes.io/projected/2ca466c4-d03d-48cc-9953-17f89bb0e48d-kube-api-access-c6gwb\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.179658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-client-ca\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.179732 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-config\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc 
kubenswrapper[4743]: I0310 15:09:55.179763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca466c4-d03d-48cc-9953-17f89bb0e48d-serving-cert\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.345023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-config\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.347582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca466c4-d03d-48cc-9953-17f89bb0e48d-serving-cert\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.349552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gwb\" (UniqueName: \"kubernetes.io/projected/2ca466c4-d03d-48cc-9953-17f89bb0e48d-kube-api-access-c6gwb\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.349640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-client-ca\") pod 
\"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.350898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-client-ca\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.356599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca466c4-d03d-48cc-9953-17f89bb0e48d-serving-cert\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.373861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gwb\" (UniqueName: \"kubernetes.io/projected/2ca466c4-d03d-48cc-9953-17f89bb0e48d-kube-api-access-c6gwb\") pod \"route-controller-manager-5fd4874df9-cjtbq\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.618314 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:09:55 crc kubenswrapper[4743]: I0310 15:09:55.933512 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37714ff0-26cf-4540-9065-82c1dd15885a" path="/var/lib/kubelet/pods/37714ff0-26cf-4540-9065-82c1dd15885a/volumes" Mar 10 15:09:57 crc kubenswrapper[4743]: I0310 15:09:57.064918 4743 ???:1] "http: TLS handshake error from 192.168.126.11:33628: no serving certificate available for the kubelet" Mar 10 15:10:00 crc kubenswrapper[4743]: I0310 15:10:00.136587 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2bvk6"] Mar 10 15:10:00 crc kubenswrapper[4743]: I0310 15:10:00.138322 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-2bvk6" Mar 10 15:10:00 crc kubenswrapper[4743]: I0310 15:10:00.141372 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:10:00 crc kubenswrapper[4743]: I0310 15:10:00.148118 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2bvk6"] Mar 10 15:10:00 crc kubenswrapper[4743]: I0310 15:10:00.228099 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snn7g\" (UniqueName: \"kubernetes.io/projected/ac6ef377-422d-42a7-aedb-5adad149a2bc-kube-api-access-snn7g\") pod \"auto-csr-approver-29552590-2bvk6\" (UID: \"ac6ef377-422d-42a7-aedb-5adad149a2bc\") " pod="openshift-infra/auto-csr-approver-29552590-2bvk6" Mar 10 15:10:00 crc kubenswrapper[4743]: I0310 15:10:00.330168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snn7g\" (UniqueName: \"kubernetes.io/projected/ac6ef377-422d-42a7-aedb-5adad149a2bc-kube-api-access-snn7g\") pod 
\"auto-csr-approver-29552590-2bvk6\" (UID: \"ac6ef377-422d-42a7-aedb-5adad149a2bc\") " pod="openshift-infra/auto-csr-approver-29552590-2bvk6" Mar 10 15:10:00 crc kubenswrapper[4743]: I0310 15:10:00.352718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snn7g\" (UniqueName: \"kubernetes.io/projected/ac6ef377-422d-42a7-aedb-5adad149a2bc-kube-api-access-snn7g\") pod \"auto-csr-approver-29552590-2bvk6\" (UID: \"ac6ef377-422d-42a7-aedb-5adad149a2bc\") " pod="openshift-infra/auto-csr-approver-29552590-2bvk6" Mar 10 15:10:00 crc kubenswrapper[4743]: I0310 15:10:00.459018 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-2bvk6" Mar 10 15:10:01 crc kubenswrapper[4743]: I0310 15:10:01.418088 4743 scope.go:117] "RemoveContainer" containerID="2af29dd19cd2a016c36592195126d15bfe6fbce40a6df22cbdf75b46987b92e8" Mar 10 15:10:01 crc kubenswrapper[4743]: E0310 15:10:01.510939 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 15:10:01 crc kubenswrapper[4743]: E0310 15:10:01.511134 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s9z62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mkgdp_openshift-marketplace(2b8d4a30-a71d-48ca-b702-772f0e08c566): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:01 crc kubenswrapper[4743]: E0310 15:10:01.512388 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mkgdp" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" Mar 10 15:10:02 crc 
kubenswrapper[4743]: I0310 15:10:02.770234 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-76lwx" Mar 10 15:10:03 crc kubenswrapper[4743]: E0310 15:10:03.030201 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mkgdp" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" Mar 10 15:10:03 crc kubenswrapper[4743]: E0310 15:10:03.108651 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 15:10:03 crc kubenswrapper[4743]: E0310 15:10:03.109375 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gzd7j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8qbmf_openshift-marketplace(5fb16ec3-618c-4095-a3e7-3f59920d921b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:03 crc kubenswrapper[4743]: E0310 15:10:03.111049 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8qbmf" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" Mar 10 15:10:03 crc 
kubenswrapper[4743]: E0310 15:10:03.158049 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 15:10:03 crc kubenswrapper[4743]: E0310 15:10:03.158311 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lq52p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-h4q2k_openshift-marketplace(ec21919b-a512-42f8-b1ce-80498821cb65): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:03 crc kubenswrapper[4743]: E0310 15:10:03.159505 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-h4q2k" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" Mar 10 15:10:05 crc kubenswrapper[4743]: I0310 15:10:05.911590 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2lf6"] Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.270644 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.271659 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.274117 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.274543 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.326433 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.374644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.374949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.476286 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.476436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.476467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.501372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:06 crc kubenswrapper[4743]: I0310 15:10:06.644099 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:07 crc kubenswrapper[4743]: E0310 15:10:07.700976 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8qbmf" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" Mar 10 15:10:07 crc kubenswrapper[4743]: E0310 15:10:07.701079 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-h4q2k" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" Mar 10 15:10:08 crc kubenswrapper[4743]: I0310 15:10:08.311962 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74648876b-6z5s9"] Mar 10 15:10:08 crc kubenswrapper[4743]: I0310 15:10:08.327923 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq"] Mar 10 15:10:08 crc kubenswrapper[4743]: I0310 15:10:08.426465 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:10:08 crc kubenswrapper[4743]: I0310 15:10:08.429955 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2bvk6"] Mar 10 15:10:08 crc kubenswrapper[4743]: I0310 15:10:08.567165 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74648876b-6z5s9"] Mar 10 15:10:08 crc kubenswrapper[4743]: E0310 15:10:08.596314 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 15:10:08 crc kubenswrapper[4743]: E0310 15:10:08.596502 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzxhn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cv2mc_openshift-marketplace(5faf25f9-ab4b-468e-b1e9-71a0d8e9e493): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Mar 10 15:10:08 crc kubenswrapper[4743]: E0310 15:10:08.598199 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cv2mc" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" Mar 10 15:10:08 crc kubenswrapper[4743]: I0310 15:10:08.663390 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq"] Mar 10 15:10:09 crc kubenswrapper[4743]: I0310 15:10:09.141157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" event={"ID":"2ca466c4-d03d-48cc-9953-17f89bb0e48d","Type":"ContainerStarted","Data":"1700b2d50c98bb47526b7a66c939f2f2125ec48758838f10cb3d94295b0d75d3"} Mar 10 15:10:09 crc kubenswrapper[4743]: I0310 15:10:09.142870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e90f9844-e6c0-438e-8ad4-ee2a3916870e","Type":"ContainerStarted","Data":"eb0021500d47029fde79196749cd05d67e626e78748f512e6f6ad3beb97d855d"} Mar 10 15:10:09 crc kubenswrapper[4743]: I0310 15:10:09.144498 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" event={"ID":"0c56fd26-496a-4d08-8e5f-32cec30e295e","Type":"ContainerStarted","Data":"092357e286cd381d9aeacca8b54fadfb9956c3028563ef43e3c79334d705c86e"} Mar 10 15:10:09 crc kubenswrapper[4743]: I0310 15:10:09.147365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-2bvk6" event={"ID":"ac6ef377-422d-42a7-aedb-5adad149a2bc","Type":"ContainerStarted","Data":"0440b3f681f3581adec723053d15db07b272127c41e25919774e935ea50c7989"} Mar 10 15:10:09 crc 
kubenswrapper[4743]: E0310 15:10:09.442415 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cv2mc" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.154930 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxzjq" event={"ID":"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c","Type":"ContainerStarted","Data":"11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80"} Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.158942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" event={"ID":"2ca466c4-d03d-48cc-9953-17f89bb0e48d","Type":"ContainerStarted","Data":"4458862cd113abf3aa4087a7c6ee8df4a203c56856509d648a53f71c71153364"} Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.159052 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.158983 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" podUID="2ca466c4-d03d-48cc-9953-17f89bb0e48d" containerName="route-controller-manager" containerID="cri-o://4458862cd113abf3aa4087a7c6ee8df4a203c56856509d648a53f71c71153364" gracePeriod=30 Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.164491 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" event={"ID":"7837cec9-3686-497f-b9ec-2525768cd8ce","Type":"ContainerStarted","Data":"3d3978e5601da2327dd3b88d62099deb80fe00f1900a29a2184ec135db7f1b97"} Mar 10 
15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.171177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e90f9844-e6c0-438e-8ad4-ee2a3916870e","Type":"ContainerStarted","Data":"a403c9df40f2632a67a4436c75cf3d1c6eb7a537bbaad47af53e6c28a3274060"} Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.173620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" event={"ID":"0c56fd26-496a-4d08-8e5f-32cec30e295e","Type":"ContainerStarted","Data":"21f6efca8bc566fb8c37e448181b2da222d7a072918e652102dae86ae1be4fd8"} Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.173679 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.173676 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" podUID="0c56fd26-496a-4d08-8e5f-32cec30e295e" containerName="controller-manager" containerID="cri-o://21f6efca8bc566fb8c37e448181b2da222d7a072918e652102dae86ae1be4fd8" gracePeriod=30 Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.179474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hq6n" event={"ID":"0eb81c77-0afe-417e-a904-90a76e45f309","Type":"ContainerStarted","Data":"45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a"} Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.179593 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.188238 4743 generic.go:334] "Generic (PLEG): container finished" podID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" 
containerID="0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb" exitCode=0 Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.188566 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sz5nj" event={"ID":"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c","Type":"ContainerDied","Data":"0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb"} Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.191602 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6tgf" event={"ID":"d815b880-6675-42e2-8380-3e1aaae065a7","Type":"ContainerStarted","Data":"865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8"} Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.204985 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" podStartSLOduration=85.803480206 podStartE2EDuration="2m10.204960491s" podCreationTimestamp="2026-03-10 15:08:00 +0000 UTC" firstStartedPulling="2026-03-10 15:09:25.042193036 +0000 UTC m=+229.749007784" lastFinishedPulling="2026-03-10 15:10:09.443673321 +0000 UTC m=+274.150488069" observedRunningTime="2026-03-10 15:10:10.202285411 +0000 UTC m=+274.909100159" watchObservedRunningTime="2026-03-10 15:10:10.204960491 +0000 UTC m=+274.911775239" Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.226239 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" podStartSLOduration=22.226218588 podStartE2EDuration="22.226218588s" podCreationTimestamp="2026-03-10 15:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:10.223450035 +0000 UTC m=+274.930264793" watchObservedRunningTime="2026-03-10 15:10:10.226218588 +0000 UTC m=+274.933033336" Mar 10 15:10:10 crc 
kubenswrapper[4743]: I0310 15:10:10.322004 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.318315285 podStartE2EDuration="4.318315285s" podCreationTimestamp="2026-03-10 15:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:10.263616437 +0000 UTC m=+274.970431185" watchObservedRunningTime="2026-03-10 15:10:10.318315285 +0000 UTC m=+275.025130033" Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.345353 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" podStartSLOduration=22.345327904 podStartE2EDuration="22.345327904s" podCreationTimestamp="2026-03-10 15:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:10.299755439 +0000 UTC m=+275.006570187" watchObservedRunningTime="2026-03-10 15:10:10.345327904 +0000 UTC m=+275.052142652" Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.402418 4743 patch_prober.go:28] interesting pod/route-controller-manager-5fd4874df9-cjtbq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:58294->10.217.0.60:8443: read: connection reset by peer" start-of-body= Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.402531 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" podUID="2ca466c4-d03d-48cc-9953-17f89bb0e48d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:58294->10.217.0.60:8443: read: connection reset by peer" Mar 10 
15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.604630 4743 csr.go:261] certificate signing request csr-2292s is approved, waiting to be issued Mar 10 15:10:10 crc kubenswrapper[4743]: I0310 15:10:10.617110 4743 csr.go:257] certificate signing request csr-2292s is issued Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.206354 4743 generic.go:334] "Generic (PLEG): container finished" podID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerID="11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80" exitCode=0 Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.206476 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxzjq" event={"ID":"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c","Type":"ContainerDied","Data":"11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80"} Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.208977 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5fd4874df9-cjtbq_2ca466c4-d03d-48cc-9953-17f89bb0e48d/route-controller-manager/0.log" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.209045 4743 generic.go:334] "Generic (PLEG): container finished" podID="2ca466c4-d03d-48cc-9953-17f89bb0e48d" containerID="4458862cd113abf3aa4087a7c6ee8df4a203c56856509d648a53f71c71153364" exitCode=255 Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.209125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" event={"ID":"2ca466c4-d03d-48cc-9953-17f89bb0e48d","Type":"ContainerDied","Data":"4458862cd113abf3aa4087a7c6ee8df4a203c56856509d648a53f71c71153364"} Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.210651 4743 generic.go:334] "Generic (PLEG): container finished" podID="e90f9844-e6c0-438e-8ad4-ee2a3916870e" containerID="a403c9df40f2632a67a4436c75cf3d1c6eb7a537bbaad47af53e6c28a3274060" exitCode=0 Mar 10 
15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.210718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e90f9844-e6c0-438e-8ad4-ee2a3916870e","Type":"ContainerDied","Data":"a403c9df40f2632a67a4436c75cf3d1c6eb7a537bbaad47af53e6c28a3274060"} Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.214421 4743 generic.go:334] "Generic (PLEG): container finished" podID="0c56fd26-496a-4d08-8e5f-32cec30e295e" containerID="21f6efca8bc566fb8c37e448181b2da222d7a072918e652102dae86ae1be4fd8" exitCode=0 Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.214476 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" event={"ID":"0c56fd26-496a-4d08-8e5f-32cec30e295e","Type":"ContainerDied","Data":"21f6efca8bc566fb8c37e448181b2da222d7a072918e652102dae86ae1be4fd8"} Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.218246 4743 generic.go:334] "Generic (PLEG): container finished" podID="d815b880-6675-42e2-8380-3e1aaae065a7" containerID="865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8" exitCode=0 Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.218319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6tgf" event={"ID":"d815b880-6675-42e2-8380-3e1aaae065a7","Type":"ContainerDied","Data":"865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8"} Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.253299 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.253777 4743 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.281034 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.282139 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.296378 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.374008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-var-lock\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.374303 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kube-api-access\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.374500 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc 
kubenswrapper[4743]: I0310 15:10:11.476792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kube-api-access\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.476916 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.477048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-var-lock\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.477115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.477144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-var-lock\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.501652 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kube-api-access\") pod \"installer-9-crc\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.619298 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-06 17:31:07.663157854 +0000 UTC Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.619360 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6506h20m56.043801619s for next certificate rotation Mar 10 15:10:11 crc kubenswrapper[4743]: I0310 15:10:11.648861 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.050570 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5fd4874df9-cjtbq_2ca466c4-d03d-48cc-9953-17f89bb0e48d/route-controller-manager/0.log" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.050969 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.077761 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z"] Mar 10 15:10:12 crc kubenswrapper[4743]: E0310 15:10:12.078100 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca466c4-d03d-48cc-9953-17f89bb0e48d" containerName="route-controller-manager" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.078116 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca466c4-d03d-48cc-9953-17f89bb0e48d" containerName="route-controller-manager" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.078256 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca466c4-d03d-48cc-9953-17f89bb0e48d" containerName="route-controller-manager" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.078783 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.090451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca466c4-d03d-48cc-9953-17f89bb0e48d-serving-cert\") pod \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.090497 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-client-ca\") pod \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.090642 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-config\") pod \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.090677 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gwb\" (UniqueName: \"kubernetes.io/projected/2ca466c4-d03d-48cc-9953-17f89bb0e48d-kube-api-access-c6gwb\") pod \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\" (UID: \"2ca466c4-d03d-48cc-9953-17f89bb0e48d\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.091442 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-client-ca" (OuterVolumeSpecName: "client-ca") pod "2ca466c4-d03d-48cc-9953-17f89bb0e48d" (UID: "2ca466c4-d03d-48cc-9953-17f89bb0e48d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.092632 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-config" (OuterVolumeSpecName: "config") pod "2ca466c4-d03d-48cc-9953-17f89bb0e48d" (UID: "2ca466c4-d03d-48cc-9953-17f89bb0e48d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.096965 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.099338 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca466c4-d03d-48cc-9953-17f89bb0e48d-kube-api-access-c6gwb" (OuterVolumeSpecName: "kube-api-access-c6gwb") pod "2ca466c4-d03d-48cc-9953-17f89bb0e48d" (UID: "2ca466c4-d03d-48cc-9953-17f89bb0e48d"). InnerVolumeSpecName "kube-api-access-c6gwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.100518 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca466c4-d03d-48cc-9953-17f89bb0e48d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2ca466c4-d03d-48cc-9953-17f89bb0e48d" (UID: "2ca466c4-d03d-48cc-9953-17f89bb0e48d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.103423 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z"] Mar 10 15:10:12 crc kubenswrapper[4743]: W0310 15:10:12.114089 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcbaf72e5_0721_41a6_957c_28ce7dbb7ff1.slice/crio-d3ea96daa2fea20a6ab7c493e127599308a06d5f398b80068e55e1c5993e016e WatchSource:0}: Error finding container d3ea96daa2fea20a6ab7c493e127599308a06d5f398b80068e55e1c5993e016e: Status 404 returned error can't find the container with id d3ea96daa2fea20a6ab7c493e127599308a06d5f398b80068e55e1c5993e016e Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.192194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-config\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.192651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-serving-cert\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.192696 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgfz\" (UniqueName: \"kubernetes.io/projected/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-kube-api-access-psgfz\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: 
\"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.192801 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-client-ca\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.192877 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca466c4-d03d-48cc-9953-17f89bb0e48d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.192895 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.192909 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca466c4-d03d-48cc-9953-17f89bb0e48d-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.192924 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gwb\" (UniqueName: \"kubernetes.io/projected/2ca466c4-d03d-48cc-9953-17f89bb0e48d-kube-api-access-c6gwb\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.229994 4743 generic.go:334] "Generic (PLEG): container finished" podID="0eb81c77-0afe-417e-a904-90a76e45f309" containerID="45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a" exitCode=0 Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.230106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2hq6n" event={"ID":"0eb81c77-0afe-417e-a904-90a76e45f309","Type":"ContainerDied","Data":"45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a"} Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.232458 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1","Type":"ContainerStarted","Data":"d3ea96daa2fea20a6ab7c493e127599308a06d5f398b80068e55e1c5993e016e"} Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.235573 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5fd4874df9-cjtbq_2ca466c4-d03d-48cc-9953-17f89bb0e48d/route-controller-manager/0.log" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.235656 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" event={"ID":"2ca466c4-d03d-48cc-9953-17f89bb0e48d","Type":"ContainerDied","Data":"1700b2d50c98bb47526b7a66c939f2f2125ec48758838f10cb3d94295b0d75d3"} Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.235707 4743 scope.go:117] "RemoveContainer" containerID="4458862cd113abf3aa4087a7c6ee8df4a203c56856509d648a53f71c71153364" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.235900 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.252847 4743 generic.go:334] "Generic (PLEG): container finished" podID="7837cec9-3686-497f-b9ec-2525768cd8ce" containerID="3d3978e5601da2327dd3b88d62099deb80fe00f1900a29a2184ec135db7f1b97" exitCode=0 Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.252929 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" event={"ID":"7837cec9-3686-497f-b9ec-2525768cd8ce","Type":"ContainerDied","Data":"3d3978e5601da2327dd3b88d62099deb80fe00f1900a29a2184ec135db7f1b97"} Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.295550 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-client-ca\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.295605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-config\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.295698 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-serving-cert\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc 
kubenswrapper[4743]: I0310 15:10:12.295730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgfz\" (UniqueName: \"kubernetes.io/projected/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-kube-api-access-psgfz\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.297127 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-client-ca\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.308121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-serving-cert\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.314531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgfz\" (UniqueName: \"kubernetes.io/projected/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-kube-api-access-psgfz\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.344515 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq"] Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.359453 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-config\") pod \"route-controller-manager-58666b54d-4rr6z\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.367945 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fd4874df9-cjtbq"] Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.400732 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.467771 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.498234 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fp7j\" (UniqueName: \"kubernetes.io/projected/0c56fd26-496a-4d08-8e5f-32cec30e295e-kube-api-access-6fp7j\") pod \"0c56fd26-496a-4d08-8e5f-32cec30e295e\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.498323 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-proxy-ca-bundles\") pod \"0c56fd26-496a-4d08-8e5f-32cec30e295e\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.498367 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-config\") pod \"0c56fd26-496a-4d08-8e5f-32cec30e295e\" 
(UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.498420 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c56fd26-496a-4d08-8e5f-32cec30e295e-serving-cert\") pod \"0c56fd26-496a-4d08-8e5f-32cec30e295e\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.498527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-client-ca\") pod \"0c56fd26-496a-4d08-8e5f-32cec30e295e\" (UID: \"0c56fd26-496a-4d08-8e5f-32cec30e295e\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.499845 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-client-ca" (OuterVolumeSpecName: "client-ca") pod "0c56fd26-496a-4d08-8e5f-32cec30e295e" (UID: "0c56fd26-496a-4d08-8e5f-32cec30e295e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.500699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0c56fd26-496a-4d08-8e5f-32cec30e295e" (UID: "0c56fd26-496a-4d08-8e5f-32cec30e295e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.500929 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-config" (OuterVolumeSpecName: "config") pod "0c56fd26-496a-4d08-8e5f-32cec30e295e" (UID: "0c56fd26-496a-4d08-8e5f-32cec30e295e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.532358 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c56fd26-496a-4d08-8e5f-32cec30e295e-kube-api-access-6fp7j" (OuterVolumeSpecName: "kube-api-access-6fp7j") pod "0c56fd26-496a-4d08-8e5f-32cec30e295e" (UID: "0c56fd26-496a-4d08-8e5f-32cec30e295e"). InnerVolumeSpecName "kube-api-access-6fp7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.532499 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c56fd26-496a-4d08-8e5f-32cec30e295e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0c56fd26-496a-4d08-8e5f-32cec30e295e" (UID: "0c56fd26-496a-4d08-8e5f-32cec30e295e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.599994 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fp7j\" (UniqueName: \"kubernetes.io/projected/0c56fd26-496a-4d08-8e5f-32cec30e295e-kube-api-access-6fp7j\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.600400 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.600411 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.600421 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c56fd26-496a-4d08-8e5f-32cec30e295e-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.600432 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c56fd26-496a-4d08-8e5f-32cec30e295e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.620143 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-11 14:19:28.855347367 +0000 UTC Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.620199 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6623h9m16.235151359s for next certificate rotation Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.671471 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.705584 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kube-api-access\") pod \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\" (UID: \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.705734 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kubelet-dir\") pod \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\" (UID: \"e90f9844-e6c0-438e-8ad4-ee2a3916870e\") " Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.706131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e90f9844-e6c0-438e-8ad4-ee2a3916870e" (UID: "e90f9844-e6c0-438e-8ad4-ee2a3916870e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.718526 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e90f9844-e6c0-438e-8ad4-ee2a3916870e" (UID: "e90f9844-e6c0-438e-8ad4-ee2a3916870e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.807226 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.807277 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90f9844-e6c0-438e-8ad4-ee2a3916870e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4743]: I0310 15:10:12.969383 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z"] Mar 10 15:10:12 crc kubenswrapper[4743]: W0310 15:10:12.976216 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49e2136c_1c5a_4d3c_b522_ba960f3cf08e.slice/crio-4d7d903e45bd9693517bd6552d40e40a7717e93e5d50b16c3512f1737e96f800 WatchSource:0}: Error finding container 4d7d903e45bd9693517bd6552d40e40a7717e93e5d50b16c3512f1737e96f800: Status 404 returned error can't find the container with id 4d7d903e45bd9693517bd6552d40e40a7717e93e5d50b16c3512f1737e96f800 Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.263257 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" 
event={"ID":"49e2136c-1c5a-4d3c-b522-ba960f3cf08e","Type":"ContainerStarted","Data":"4d7d903e45bd9693517bd6552d40e40a7717e93e5d50b16c3512f1737e96f800"} Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.265541 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.265527 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74648876b-6z5s9" event={"ID":"0c56fd26-496a-4d08-8e5f-32cec30e295e","Type":"ContainerDied","Data":"092357e286cd381d9aeacca8b54fadfb9956c3028563ef43e3c79334d705c86e"} Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.265693 4743 scope.go:117] "RemoveContainer" containerID="21f6efca8bc566fb8c37e448181b2da222d7a072918e652102dae86ae1be4fd8" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.271108 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e90f9844-e6c0-438e-8ad4-ee2a3916870e","Type":"ContainerDied","Data":"eb0021500d47029fde79196749cd05d67e626e78748f512e6f6ad3beb97d855d"} Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.271478 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb0021500d47029fde79196749cd05d67e626e78748f512e6f6ad3beb97d855d" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.271202 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.272868 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1","Type":"ContainerStarted","Data":"05827689b49048499d1f79cee15a306e3ea075f4a14b58c5f917eb8326354106"} Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.297725 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.2976942510000002 podStartE2EDuration="2.297694251s" podCreationTimestamp="2026-03-10 15:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:13.293369712 +0000 UTC m=+278.000184460" watchObservedRunningTime="2026-03-10 15:10:13.297694251 +0000 UTC m=+278.004508999" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.313086 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74648876b-6z5s9"] Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.316365 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74648876b-6z5s9"] Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.508536 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.624490 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqfxn\" (UniqueName: \"kubernetes.io/projected/7837cec9-3686-497f-b9ec-2525768cd8ce-kube-api-access-lqfxn\") pod \"7837cec9-3686-497f-b9ec-2525768cd8ce\" (UID: \"7837cec9-3686-497f-b9ec-2525768cd8ce\") " Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.632084 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7837cec9-3686-497f-b9ec-2525768cd8ce-kube-api-access-lqfxn" (OuterVolumeSpecName: "kube-api-access-lqfxn") pod "7837cec9-3686-497f-b9ec-2525768cd8ce" (UID: "7837cec9-3686-497f-b9ec-2525768cd8ce"). InnerVolumeSpecName "kube-api-access-lqfxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.726446 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqfxn\" (UniqueName: \"kubernetes.io/projected/7837cec9-3686-497f-b9ec-2525768cd8ce-kube-api-access-lqfxn\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.923354 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c56fd26-496a-4d08-8e5f-32cec30e295e" path="/var/lib/kubelet/pods/0c56fd26-496a-4d08-8e5f-32cec30e295e/volumes" Mar 10 15:10:13 crc kubenswrapper[4743]: I0310 15:10:13.924486 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca466c4-d03d-48cc-9953-17f89bb0e48d" path="/var/lib/kubelet/pods/2ca466c4-d03d-48cc-9953-17f89bb0e48d/volumes" Mar 10 15:10:14 crc kubenswrapper[4743]: I0310 15:10:14.280303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" 
event={"ID":"7837cec9-3686-497f-b9ec-2525768cd8ce","Type":"ContainerDied","Data":"800d2fb78eca325b4edd7282a23ace9db1ba411818bead60dc6de13b9d5bc041"} Mar 10 15:10:14 crc kubenswrapper[4743]: I0310 15:10:14.280355 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800d2fb78eca325b4edd7282a23ace9db1ba411818bead60dc6de13b9d5bc041" Mar 10 15:10:14 crc kubenswrapper[4743]: I0310 15:10:14.280409 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-mtmtv" Mar 10 15:10:14 crc kubenswrapper[4743]: I0310 15:10:14.281830 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" event={"ID":"49e2136c-1c5a-4d3c-b522-ba960f3cf08e","Type":"ContainerStarted","Data":"870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d"} Mar 10 15:10:14 crc kubenswrapper[4743]: I0310 15:10:14.282305 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:14 crc kubenswrapper[4743]: I0310 15:10:14.307988 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" podStartSLOduration=6.307942915 podStartE2EDuration="6.307942915s" podCreationTimestamp="2026-03-10 15:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:14.302325807 +0000 UTC m=+279.009140555" watchObservedRunningTime="2026-03-10 15:10:14.307942915 +0000 UTC m=+279.014757673" Mar 10 15:10:14 crc kubenswrapper[4743]: I0310 15:10:14.347277 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:15 crc kubenswrapper[4743]: 
I0310 15:10:15.010931 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fbfb65594-s9tvf"] Mar 10 15:10:15 crc kubenswrapper[4743]: E0310 15:10:15.011662 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90f9844-e6c0-438e-8ad4-ee2a3916870e" containerName="pruner" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.011676 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90f9844-e6c0-438e-8ad4-ee2a3916870e" containerName="pruner" Mar 10 15:10:15 crc kubenswrapper[4743]: E0310 15:10:15.011689 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7837cec9-3686-497f-b9ec-2525768cd8ce" containerName="oc" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.011696 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7837cec9-3686-497f-b9ec-2525768cd8ce" containerName="oc" Mar 10 15:10:15 crc kubenswrapper[4743]: E0310 15:10:15.011709 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c56fd26-496a-4d08-8e5f-32cec30e295e" containerName="controller-manager" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.011716 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c56fd26-496a-4d08-8e5f-32cec30e295e" containerName="controller-manager" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.011834 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7837cec9-3686-497f-b9ec-2525768cd8ce" containerName="oc" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.011845 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90f9844-e6c0-438e-8ad4-ee2a3916870e" containerName="pruner" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.011856 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c56fd26-496a-4d08-8e5f-32cec30e295e" containerName="controller-manager" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.012323 4743 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.015053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.015083 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.015602 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.018517 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.018951 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fbfb65594-s9tvf"] Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.019757 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.019991 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.039387 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.057605 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-proxy-ca-bundles\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " 
pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.058019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9r5\" (UniqueName: \"kubernetes.io/projected/a1629308-2201-4101-b82b-e9c9e1f4903d-kube-api-access-2l9r5\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.058123 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-client-ca\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.058214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1629308-2201-4101-b82b-e9c9e1f4903d-serving-cert\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.058379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-config\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.160211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-config\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.160308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-proxy-ca-bundles\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.160336 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9r5\" (UniqueName: \"kubernetes.io/projected/a1629308-2201-4101-b82b-e9c9e1f4903d-kube-api-access-2l9r5\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.160373 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-client-ca\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.160400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1629308-2201-4101-b82b-e9c9e1f4903d-serving-cert\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.161424 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-client-ca\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.161593 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-config\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.162206 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-proxy-ca-bundles\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.173132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1629308-2201-4101-b82b-e9c9e1f4903d-serving-cert\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.179681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9r5\" (UniqueName: \"kubernetes.io/projected/a1629308-2201-4101-b82b-e9c9e1f4903d-kube-api-access-2l9r5\") pod \"controller-manager-5fbfb65594-s9tvf\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 
15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.292333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-2bvk6" event={"ID":"ac6ef377-422d-42a7-aedb-5adad149a2bc","Type":"ContainerStarted","Data":"2ae4666d4e659876cfcac4a569c66230e6faee8716566c2c5fe84bb6cb8bc3f1"} Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.295495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hq6n" event={"ID":"0eb81c77-0afe-417e-a904-90a76e45f309","Type":"ContainerStarted","Data":"859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd"} Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.298165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxzjq" event={"ID":"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c","Type":"ContainerStarted","Data":"7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d"} Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.319547 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2hq6n" podStartSLOduration=4.389172578 podStartE2EDuration="42.31952445s" podCreationTimestamp="2026-03-10 15:09:33 +0000 UTC" firstStartedPulling="2026-03-10 15:09:35.812724122 +0000 UTC m=+240.519538870" lastFinishedPulling="2026-03-10 15:10:13.743075994 +0000 UTC m=+278.449890742" observedRunningTime="2026-03-10 15:10:15.315632623 +0000 UTC m=+280.022447371" watchObservedRunningTime="2026-03-10 15:10:15.31952445 +0000 UTC m=+280.026339198" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.339145 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rxzjq" podStartSLOduration=2.694611213 podStartE2EDuration="43.339115066s" podCreationTimestamp="2026-03-10 15:09:32 +0000 UTC" firstStartedPulling="2026-03-10 15:09:33.748025864 +0000 UTC m=+238.454840612" 
lastFinishedPulling="2026-03-10 15:10:14.392529717 +0000 UTC m=+279.099344465" observedRunningTime="2026-03-10 15:10:15.338076375 +0000 UTC m=+280.044891133" watchObservedRunningTime="2026-03-10 15:10:15.339115066 +0000 UTC m=+280.045929804" Mar 10 15:10:15 crc kubenswrapper[4743]: I0310 15:10:15.340327 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:16 crc kubenswrapper[4743]: I0310 15:10:16.323408 4743 generic.go:334] "Generic (PLEG): container finished" podID="ac6ef377-422d-42a7-aedb-5adad149a2bc" containerID="2ae4666d4e659876cfcac4a569c66230e6faee8716566c2c5fe84bb6cb8bc3f1" exitCode=0 Mar 10 15:10:16 crc kubenswrapper[4743]: I0310 15:10:16.323568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-2bvk6" event={"ID":"ac6ef377-422d-42a7-aedb-5adad149a2bc","Type":"ContainerDied","Data":"2ae4666d4e659876cfcac4a569c66230e6faee8716566c2c5fe84bb6cb8bc3f1"} Mar 10 15:10:16 crc kubenswrapper[4743]: I0310 15:10:16.498426 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fbfb65594-s9tvf"] Mar 10 15:10:17 crc kubenswrapper[4743]: I0310 15:10:17.341036 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6tgf" event={"ID":"d815b880-6675-42e2-8380-3e1aaae065a7","Type":"ContainerStarted","Data":"852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d"} Mar 10 15:10:17 crc kubenswrapper[4743]: I0310 15:10:17.343041 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" event={"ID":"a1629308-2201-4101-b82b-e9c9e1f4903d","Type":"ContainerStarted","Data":"da535d66399fae0e559405af6fffe0ff95cdb0245a11fea13110f373cd995dd4"} Mar 10 15:10:17 crc kubenswrapper[4743]: I0310 15:10:17.368952 4743 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/certified-operators-f6tgf" podStartSLOduration=3.606560992 podStartE2EDuration="47.368928084s" podCreationTimestamp="2026-03-10 15:09:30 +0000 UTC" firstStartedPulling="2026-03-10 15:09:32.437403901 +0000 UTC m=+237.144218649" lastFinishedPulling="2026-03-10 15:10:16.199770993 +0000 UTC m=+280.906585741" observedRunningTime="2026-03-10 15:10:17.367375967 +0000 UTC m=+282.074190735" watchObservedRunningTime="2026-03-10 15:10:17.368928084 +0000 UTC m=+282.075742832" Mar 10 15:10:17 crc kubenswrapper[4743]: I0310 15:10:17.718999 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-2bvk6" Mar 10 15:10:17 crc kubenswrapper[4743]: I0310 15:10:17.817748 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snn7g\" (UniqueName: \"kubernetes.io/projected/ac6ef377-422d-42a7-aedb-5adad149a2bc-kube-api-access-snn7g\") pod \"ac6ef377-422d-42a7-aedb-5adad149a2bc\" (UID: \"ac6ef377-422d-42a7-aedb-5adad149a2bc\") " Mar 10 15:10:17 crc kubenswrapper[4743]: I0310 15:10:17.828068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6ef377-422d-42a7-aedb-5adad149a2bc-kube-api-access-snn7g" (OuterVolumeSpecName: "kube-api-access-snn7g") pod "ac6ef377-422d-42a7-aedb-5adad149a2bc" (UID: "ac6ef377-422d-42a7-aedb-5adad149a2bc"). InnerVolumeSpecName "kube-api-access-snn7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:17 crc kubenswrapper[4743]: I0310 15:10:17.920030 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snn7g\" (UniqueName: \"kubernetes.io/projected/ac6ef377-422d-42a7-aedb-5adad149a2bc-kube-api-access-snn7g\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:18 crc kubenswrapper[4743]: E0310 15:10:18.035632 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac6ef377_422d_42a7_aedb_5adad149a2bc.slice\": RecentStats: unable to find data in memory cache]" Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.350210 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-2bvk6" event={"ID":"ac6ef377-422d-42a7-aedb-5adad149a2bc","Type":"ContainerDied","Data":"0440b3f681f3581adec723053d15db07b272127c41e25919774e935ea50c7989"} Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.350262 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0440b3f681f3581adec723053d15db07b272127c41e25919774e935ea50c7989" Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.350339 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-2bvk6" Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.353935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sz5nj" event={"ID":"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c","Type":"ContainerStarted","Data":"b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653"} Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.356270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkgdp" event={"ID":"2b8d4a30-a71d-48ca-b702-772f0e08c566","Type":"ContainerStarted","Data":"4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1"} Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.359522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" event={"ID":"a1629308-2201-4101-b82b-e9c9e1f4903d","Type":"ContainerStarted","Data":"998f96ea6c598301a270969620d88fe5ec7ac675bca82e1acd3d3e81ff6be895"} Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.359561 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.364513 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.409008 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sz5nj" podStartSLOduration=4.05445056 podStartE2EDuration="46.408986461s" podCreationTimestamp="2026-03-10 15:09:32 +0000 UTC" firstStartedPulling="2026-03-10 15:09:34.788109848 +0000 UTC m=+239.494924596" lastFinishedPulling="2026-03-10 15:10:17.142645749 +0000 UTC m=+281.849460497" observedRunningTime="2026-03-10 15:10:18.3765417 
+0000 UTC m=+283.083356448" watchObservedRunningTime="2026-03-10 15:10:18.408986461 +0000 UTC m=+283.115801209" Mar 10 15:10:18 crc kubenswrapper[4743]: I0310 15:10:18.441040 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" podStartSLOduration=10.4410111 podStartE2EDuration="10.4410111s" podCreationTimestamp="2026-03-10 15:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:18.43834918 +0000 UTC m=+283.145163938" watchObservedRunningTime="2026-03-10 15:10:18.4410111 +0000 UTC m=+283.147825848" Mar 10 15:10:19 crc kubenswrapper[4743]: I0310 15:10:19.367800 4743 generic.go:334] "Generic (PLEG): container finished" podID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerID="4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1" exitCode=0 Mar 10 15:10:19 crc kubenswrapper[4743]: I0310 15:10:19.367869 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkgdp" event={"ID":"2b8d4a30-a71d-48ca-b702-772f0e08c566","Type":"ContainerDied","Data":"4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1"} Mar 10 15:10:20 crc kubenswrapper[4743]: I0310 15:10:20.842771 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:10:20 crc kubenswrapper[4743]: I0310 15:10:20.843193 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:10:21 crc kubenswrapper[4743]: I0310 15:10:21.180673 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:10:21 crc kubenswrapper[4743]: I0310 15:10:21.383469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mkgdp" event={"ID":"2b8d4a30-a71d-48ca-b702-772f0e08c566","Type":"ContainerStarted","Data":"fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f"} Mar 10 15:10:21 crc kubenswrapper[4743]: I0310 15:10:21.386264 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec21919b-a512-42f8-b1ce-80498821cb65" containerID="3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f" exitCode=0 Mar 10 15:10:21 crc kubenswrapper[4743]: I0310 15:10:21.386359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4q2k" event={"ID":"ec21919b-a512-42f8-b1ce-80498821cb65","Type":"ContainerDied","Data":"3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f"} Mar 10 15:10:21 crc kubenswrapper[4743]: I0310 15:10:21.388992 4743 generic.go:334] "Generic (PLEG): container finished" podID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerID="5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe" exitCode=0 Mar 10 15:10:21 crc kubenswrapper[4743]: I0310 15:10:21.389107 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qbmf" event={"ID":"5fb16ec3-618c-4095-a3e7-3f59920d921b","Type":"ContainerDied","Data":"5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe"} Mar 10 15:10:21 crc kubenswrapper[4743]: I0310 15:10:21.409972 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mkgdp" podStartSLOduration=3.066972504 podStartE2EDuration="51.409947433s" podCreationTimestamp="2026-03-10 15:09:30 +0000 UTC" firstStartedPulling="2026-03-10 15:09:32.443150978 +0000 UTC m=+237.149965726" lastFinishedPulling="2026-03-10 15:10:20.786125907 +0000 UTC m=+285.492940655" observedRunningTime="2026-03-10 15:10:21.407730776 +0000 UTC m=+286.114545524" watchObservedRunningTime="2026-03-10 15:10:21.409947433 +0000 UTC m=+286.116762181" Mar 10 
15:10:21 crc kubenswrapper[4743]: I0310 15:10:21.440606 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.397241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4q2k" event={"ID":"ec21919b-a512-42f8-b1ce-80498821cb65","Type":"ContainerStarted","Data":"b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274"} Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.400681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qbmf" event={"ID":"5fb16ec3-618c-4095-a3e7-3f59920d921b","Type":"ContainerStarted","Data":"1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8"} Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.463036 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h4q2k" podStartSLOduration=2.925903354 podStartE2EDuration="52.463012189s" podCreationTimestamp="2026-03-10 15:09:30 +0000 UTC" firstStartedPulling="2026-03-10 15:09:32.478077057 +0000 UTC m=+237.184891805" lastFinishedPulling="2026-03-10 15:10:22.015185892 +0000 UTC m=+286.722000640" observedRunningTime="2026-03-10 15:10:22.428537067 +0000 UTC m=+287.135351825" watchObservedRunningTime="2026-03-10 15:10:22.463012189 +0000 UTC m=+287.169826937" Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.465220 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8qbmf" podStartSLOduration=2.9382545220000003 podStartE2EDuration="52.465213665s" podCreationTimestamp="2026-03-10 15:09:30 +0000 UTC" firstStartedPulling="2026-03-10 15:09:32.428553403 +0000 UTC m=+237.135368151" lastFinishedPulling="2026-03-10 15:10:21.955512546 +0000 UTC m=+286.662327294" observedRunningTime="2026-03-10 15:10:22.46104005 +0000 UTC 
m=+287.167854798" watchObservedRunningTime="2026-03-10 15:10:22.465213665 +0000 UTC m=+287.172028413" Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.578162 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.578250 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.642839 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.991739 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:10:22 crc kubenswrapper[4743]: I0310 15:10:22.991850 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:10:23 crc kubenswrapper[4743]: I0310 15:10:23.043276 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:10:23 crc kubenswrapper[4743]: I0310 15:10:23.456862 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:10:23 crc kubenswrapper[4743]: I0310 15:10:23.476437 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:10:24 crc kubenswrapper[4743]: I0310 15:10:24.052652 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:10:24 crc kubenswrapper[4743]: I0310 15:10:24.052713 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 
15:10:25 crc kubenswrapper[4743]: I0310 15:10:25.099176 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hq6n" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" containerName="registry-server" probeResult="failure" output=< Mar 10 15:10:25 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 15:10:25 crc kubenswrapper[4743]: > Mar 10 15:10:25 crc kubenswrapper[4743]: I0310 15:10:25.422318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv2mc" event={"ID":"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493","Type":"ContainerStarted","Data":"9bfdba87ca7e76147a4044cc46eb7fe6b766a43bee86cd91e75edfb2c32f18d6"} Mar 10 15:10:26 crc kubenswrapper[4743]: I0310 15:10:26.397604 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sz5nj"] Mar 10 15:10:26 crc kubenswrapper[4743]: I0310 15:10:26.399651 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sz5nj" podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerName="registry-server" containerID="cri-o://b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653" gracePeriod=2 Mar 10 15:10:26 crc kubenswrapper[4743]: I0310 15:10:26.431041 4743 generic.go:334] "Generic (PLEG): container finished" podID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerID="9bfdba87ca7e76147a4044cc46eb7fe6b766a43bee86cd91e75edfb2c32f18d6" exitCode=0 Mar 10 15:10:26 crc kubenswrapper[4743]: I0310 15:10:26.431090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv2mc" event={"ID":"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493","Type":"ContainerDied","Data":"9bfdba87ca7e76147a4044cc46eb7fe6b766a43bee86cd91e75edfb2c32f18d6"} Mar 10 15:10:26 crc kubenswrapper[4743]: I0310 15:10:26.988846 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.068766 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcmff\" (UniqueName: \"kubernetes.io/projected/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-kube-api-access-pcmff\") pod \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.068853 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-utilities\") pod \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.068911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-catalog-content\") pod \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\" (UID: \"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c\") " Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.069974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-utilities" (OuterVolumeSpecName: "utilities") pod "8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" (UID: "8830b2a3-e8f3-48d0-85ca-5c936ca0a04c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.076107 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-kube-api-access-pcmff" (OuterVolumeSpecName: "kube-api-access-pcmff") pod "8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" (UID: "8830b2a3-e8f3-48d0-85ca-5c936ca0a04c"). InnerVolumeSpecName "kube-api-access-pcmff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.101301 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" (UID: "8830b2a3-e8f3-48d0-85ca-5c936ca0a04c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.170563 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcmff\" (UniqueName: \"kubernetes.io/projected/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-kube-api-access-pcmff\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.170612 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.170623 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.442379 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sz5nj" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.443142 4743 generic.go:334] "Generic (PLEG): container finished" podID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerID="b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653" exitCode=0 Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.443253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sz5nj" event={"ID":"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c","Type":"ContainerDied","Data":"b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653"} Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.443372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sz5nj" event={"ID":"8830b2a3-e8f3-48d0-85ca-5c936ca0a04c","Type":"ContainerDied","Data":"1357b05974f924dd542acf59bcdefb3de38e9a1e72760b7c184cb9df6fcd0ded"} Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.443403 4743 scope.go:117] "RemoveContainer" containerID="b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.448913 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv2mc" event={"ID":"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493","Type":"ContainerStarted","Data":"49f1f5f54716a4e436784ed8625e48acc50e3bf824860b239c314f1014ec66fc"} Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.472538 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cv2mc" podStartSLOduration=2.399573255 podStartE2EDuration="53.472517614s" podCreationTimestamp="2026-03-10 15:09:34 +0000 UTC" firstStartedPulling="2026-03-10 15:09:35.820495608 +0000 UTC m=+240.527310366" lastFinishedPulling="2026-03-10 15:10:26.893439977 +0000 UTC m=+291.600254725" observedRunningTime="2026-03-10 15:10:27.46671547 +0000 UTC 
m=+292.173530218" watchObservedRunningTime="2026-03-10 15:10:27.472517614 +0000 UTC m=+292.179332362" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.477019 4743 scope.go:117] "RemoveContainer" containerID="0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.481542 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sz5nj"] Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.490863 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sz5nj"] Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.510700 4743 scope.go:117] "RemoveContainer" containerID="1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.526828 4743 scope.go:117] "RemoveContainer" containerID="b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653" Mar 10 15:10:27 crc kubenswrapper[4743]: E0310 15:10:27.527388 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653\": container with ID starting with b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653 not found: ID does not exist" containerID="b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.527423 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653"} err="failed to get container status \"b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653\": rpc error: code = NotFound desc = could not find container \"b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653\": container with ID starting with 
b45df2687fedc35a321bce23b1bf2ed1f176aea15a326c8f17e0e21e7b2bd653 not found: ID does not exist" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.527448 4743 scope.go:117] "RemoveContainer" containerID="0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb" Mar 10 15:10:27 crc kubenswrapper[4743]: E0310 15:10:27.528045 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb\": container with ID starting with 0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb not found: ID does not exist" containerID="0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.528068 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb"} err="failed to get container status \"0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb\": rpc error: code = NotFound desc = could not find container \"0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb\": container with ID starting with 0154ea918e451195ca278501cebc32bb107bf7ffbb768184fb91411e339ed9cb not found: ID does not exist" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.528085 4743 scope.go:117] "RemoveContainer" containerID="1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063" Mar 10 15:10:27 crc kubenswrapper[4743]: E0310 15:10:27.528357 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063\": container with ID starting with 1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063 not found: ID does not exist" containerID="1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063" Mar 10 15:10:27 crc 
kubenswrapper[4743]: I0310 15:10:27.528378 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063"} err="failed to get container status \"1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063\": rpc error: code = NotFound desc = could not find container \"1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063\": container with ID starting with 1c5229373b0885fc4250e430d808c0ba4ec82d22956c5d2febe69cf859eec063 not found: ID does not exist" Mar 10 15:10:27 crc kubenswrapper[4743]: I0310 15:10:27.921873 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" path="/var/lib/kubelet/pods/8830b2a3-e8f3-48d0-85ca-5c936ca0a04c/volumes" Mar 10 15:10:28 crc kubenswrapper[4743]: I0310 15:10:28.572553 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fbfb65594-s9tvf"] Mar 10 15:10:28 crc kubenswrapper[4743]: I0310 15:10:28.572989 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" podUID="a1629308-2201-4101-b82b-e9c9e1f4903d" containerName="controller-manager" containerID="cri-o://998f96ea6c598301a270969620d88fe5ec7ac675bca82e1acd3d3e81ff6be895" gracePeriod=30 Mar 10 15:10:28 crc kubenswrapper[4743]: I0310 15:10:28.609989 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z"] Mar 10 15:10:28 crc kubenswrapper[4743]: I0310 15:10:28.610205 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" podUID="49e2136c-1c5a-4d3c-b522-ba960f3cf08e" containerName="route-controller-manager" containerID="cri-o://870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d" 
gracePeriod=30 Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.170763 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.199923 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-config\") pod \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.200069 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-client-ca\") pod \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.200109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-serving-cert\") pod \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.200193 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psgfz\" (UniqueName: \"kubernetes.io/projected/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-kube-api-access-psgfz\") pod \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\" (UID: \"49e2136c-1c5a-4d3c-b522-ba960f3cf08e\") " Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.200937 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-client-ca" (OuterVolumeSpecName: "client-ca") pod "49e2136c-1c5a-4d3c-b522-ba960f3cf08e" (UID: "49e2136c-1c5a-4d3c-b522-ba960f3cf08e"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.201594 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-config" (OuterVolumeSpecName: "config") pod "49e2136c-1c5a-4d3c-b522-ba960f3cf08e" (UID: "49e2136c-1c5a-4d3c-b522-ba960f3cf08e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.212600 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-kube-api-access-psgfz" (OuterVolumeSpecName: "kube-api-access-psgfz") pod "49e2136c-1c5a-4d3c-b522-ba960f3cf08e" (UID: "49e2136c-1c5a-4d3c-b522-ba960f3cf08e"). InnerVolumeSpecName "kube-api-access-psgfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.214040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "49e2136c-1c5a-4d3c-b522-ba960f3cf08e" (UID: "49e2136c-1c5a-4d3c-b522-ba960f3cf08e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.302778 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.303196 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.303295 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psgfz\" (UniqueName: \"kubernetes.io/projected/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-kube-api-access-psgfz\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.303371 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e2136c-1c5a-4d3c-b522-ba960f3cf08e-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.465728 4743 generic.go:334] "Generic (PLEG): container finished" podID="a1629308-2201-4101-b82b-e9c9e1f4903d" containerID="998f96ea6c598301a270969620d88fe5ec7ac675bca82e1acd3d3e81ff6be895" exitCode=0 Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.465834 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" event={"ID":"a1629308-2201-4101-b82b-e9c9e1f4903d","Type":"ContainerDied","Data":"998f96ea6c598301a270969620d88fe5ec7ac675bca82e1acd3d3e81ff6be895"} Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.467408 4743 generic.go:334] "Generic (PLEG): container finished" podID="49e2136c-1c5a-4d3c-b522-ba960f3cf08e" containerID="870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d" exitCode=0 Mar 10 15:10:29 crc kubenswrapper[4743]: 
I0310 15:10:29.467439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" event={"ID":"49e2136c-1c5a-4d3c-b522-ba960f3cf08e","Type":"ContainerDied","Data":"870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d"} Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.467460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" event={"ID":"49e2136c-1c5a-4d3c-b522-ba960f3cf08e","Type":"ContainerDied","Data":"4d7d903e45bd9693517bd6552d40e40a7717e93e5d50b16c3512f1737e96f800"} Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.467484 4743 scope.go:117] "RemoveContainer" containerID="870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.467577 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.510371 4743 scope.go:117] "RemoveContainer" containerID="870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d" Mar 10 15:10:29 crc kubenswrapper[4743]: E0310 15:10:29.513933 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d\": container with ID starting with 870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d not found: ID does not exist" containerID="870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.513979 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d"} err="failed to get container status 
\"870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d\": rpc error: code = NotFound desc = could not find container \"870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d\": container with ID starting with 870485551d84ad96211fb17236b78da8721401c5e2d083dfc74f5895fe72628d not found: ID does not exist" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.517281 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z"] Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.523435 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58666b54d-4rr6z"] Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.856712 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:29 crc kubenswrapper[4743]: I0310 15:10:29.923556 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e2136c-1c5a-4d3c-b522-ba960f3cf08e" path="/var/lib/kubelet/pods/49e2136c-1c5a-4d3c-b522-ba960f3cf08e/volumes" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.017296 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1629308-2201-4101-b82b-e9c9e1f4903d-serving-cert\") pod \"a1629308-2201-4101-b82b-e9c9e1f4903d\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.017367 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-proxy-ca-bundles\") pod \"a1629308-2201-4101-b82b-e9c9e1f4903d\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.017430 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-client-ca\") pod \"a1629308-2201-4101-b82b-e9c9e1f4903d\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.017472 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-config\") pod \"a1629308-2201-4101-b82b-e9c9e1f4903d\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.017511 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l9r5\" (UniqueName: \"kubernetes.io/projected/a1629308-2201-4101-b82b-e9c9e1f4903d-kube-api-access-2l9r5\") pod \"a1629308-2201-4101-b82b-e9c9e1f4903d\" (UID: \"a1629308-2201-4101-b82b-e9c9e1f4903d\") " Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.019565 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a1629308-2201-4101-b82b-e9c9e1f4903d" (UID: "a1629308-2201-4101-b82b-e9c9e1f4903d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.019616 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-client-ca" (OuterVolumeSpecName: "client-ca") pod "a1629308-2201-4101-b82b-e9c9e1f4903d" (UID: "a1629308-2201-4101-b82b-e9c9e1f4903d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.020115 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-config" (OuterVolumeSpecName: "config") pod "a1629308-2201-4101-b82b-e9c9e1f4903d" (UID: "a1629308-2201-4101-b82b-e9c9e1f4903d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023190 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b"] Mar 10 15:10:30 crc kubenswrapper[4743]: E0310 15:10:30.023580 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerName="extract-content" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023607 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerName="extract-content" Mar 10 15:10:30 crc kubenswrapper[4743]: E0310 15:10:30.023630 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e2136c-1c5a-4d3c-b522-ba960f3cf08e" containerName="route-controller-manager" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023643 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e2136c-1c5a-4d3c-b522-ba960f3cf08e" containerName="route-controller-manager" Mar 10 15:10:30 crc kubenswrapper[4743]: E0310 15:10:30.023667 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6ef377-422d-42a7-aedb-5adad149a2bc" containerName="oc" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023678 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6ef377-422d-42a7-aedb-5adad149a2bc" containerName="oc" Mar 10 15:10:30 crc kubenswrapper[4743]: E0310 15:10:30.023692 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerName="registry-server" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023701 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerName="registry-server" Mar 10 15:10:30 crc kubenswrapper[4743]: E0310 15:10:30.023722 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1629308-2201-4101-b82b-e9c9e1f4903d" containerName="controller-manager" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023731 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1629308-2201-4101-b82b-e9c9e1f4903d" containerName="controller-manager" Mar 10 15:10:30 crc kubenswrapper[4743]: E0310 15:10:30.023747 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerName="extract-utilities" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023757 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerName="extract-utilities" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023915 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6ef377-422d-42a7-aedb-5adad149a2bc" containerName="oc" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023933 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1629308-2201-4101-b82b-e9c9e1f4903d" containerName="controller-manager" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023947 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e2136c-1c5a-4d3c-b522-ba960f3cf08e" containerName="route-controller-manager" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.023958 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8830b2a3-e8f3-48d0-85ca-5c936ca0a04c" containerName="registry-server" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.024550 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.026272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1629308-2201-4101-b82b-e9c9e1f4903d-kube-api-access-2l9r5" (OuterVolumeSpecName: "kube-api-access-2l9r5") pod "a1629308-2201-4101-b82b-e9c9e1f4903d" (UID: "a1629308-2201-4101-b82b-e9c9e1f4903d"). InnerVolumeSpecName "kube-api-access-2l9r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.026610 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1629308-2201-4101-b82b-e9c9e1f4903d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a1629308-2201-4101-b82b-e9c9e1f4903d" (UID: "a1629308-2201-4101-b82b-e9c9e1f4903d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.027694 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7865496797-txnhr"] Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.027980 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.028260 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.028447 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.028912 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.029422 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.029761 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.029994 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.032838 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b"] Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.036474 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7865496797-txnhr"] Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.119290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-serving-cert\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.119350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5xbv\" (UniqueName: \"kubernetes.io/projected/675e18a7-71e0-455c-a7a8-a99e10cc6576-kube-api-access-c5xbv\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc 
kubenswrapper[4743]: I0310 15:10:30.119383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4xww\" (UniqueName: \"kubernetes.io/projected/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-kube-api-access-m4xww\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.119662 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-client-ca\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.119783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-proxy-ca-bundles\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.119888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675e18a7-71e0-455c-a7a8-a99e10cc6576-serving-cert\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.120197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-config\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.120281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-config\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.120356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-client-ca\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.120593 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1629308-2201-4101-b82b-e9c9e1f4903d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.120614 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.120627 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.120644 4743 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1629308-2201-4101-b82b-e9c9e1f4903d-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.120657 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l9r5\" (UniqueName: \"kubernetes.io/projected/a1629308-2201-4101-b82b-e9c9e1f4903d-kube-api-access-2l9r5\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.222239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-serving-cert\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.222298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5xbv\" (UniqueName: \"kubernetes.io/projected/675e18a7-71e0-455c-a7a8-a99e10cc6576-kube-api-access-c5xbv\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.222339 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4xww\" (UniqueName: \"kubernetes.io/projected/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-kube-api-access-m4xww\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.222380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-client-ca\") pod 
\"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.222407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-proxy-ca-bundles\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.223602 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-client-ca\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.223761 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675e18a7-71e0-455c-a7a8-a99e10cc6576-serving-cert\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.223832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-config\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.223868 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-config\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.223924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-client-ca\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.224084 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-proxy-ca-bundles\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.224753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-client-ca\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.225227 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-config\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc 
kubenswrapper[4743]: I0310 15:10:30.226003 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-config\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.226136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-serving-cert\") pod \"controller-manager-7865496797-txnhr\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.228834 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675e18a7-71e0-455c-a7a8-a99e10cc6576-serving-cert\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.241044 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5xbv\" (UniqueName: \"kubernetes.io/projected/675e18a7-71e0-455c-a7a8-a99e10cc6576-kube-api-access-c5xbv\") pod \"route-controller-manager-588c99c9b4-f8k8b\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.242993 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4xww\" (UniqueName: \"kubernetes.io/projected/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-kube-api-access-m4xww\") pod \"controller-manager-7865496797-txnhr\" (UID: 
\"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.384845 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:30 crc kubenswrapper[4743]: I0310 15:10:30.394564 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.492737 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" event={"ID":"a1629308-2201-4101-b82b-e9c9e1f4903d","Type":"ContainerDied","Data":"da535d66399fae0e559405af6fffe0ff95cdb0245a11fea13110f373cd995dd4"} Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.493215 4743 scope.go:117] "RemoveContainer" containerID="998f96ea6c598301a270969620d88fe5ec7ac675bca82e1acd3d3e81ff6be895" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.493403 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbfb65594-s9tvf" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.563103 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fbfb65594-s9tvf"] Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.570970 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5fbfb65594-s9tvf"] Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.661576 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.661682 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.708176 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:30.972298 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" podUID="9164077f-fddc-43e6-9aac-23a8be818d9f" containerName="oauth-openshift" containerID="cri-o://19d23ff6fa2796978a844d0c9165002eedc6b343d42db71f24adbca6f6ea9f3f" gracePeriod=15 Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.015530 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.016463 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.057533 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h4q2k" 
Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.227492 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.227607 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.266950 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.500878 4743 generic.go:334] "Generic (PLEG): container finished" podID="9164077f-fddc-43e6-9aac-23a8be818d9f" containerID="19d23ff6fa2796978a844d0c9165002eedc6b343d42db71f24adbca6f6ea9f3f" exitCode=0 Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.501202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" event={"ID":"9164077f-fddc-43e6-9aac-23a8be818d9f","Type":"ContainerDied","Data":"19d23ff6fa2796978a844d0c9165002eedc6b343d42db71f24adbca6f6ea9f3f"} Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.540409 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.542555 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.544185 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.687829 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b"] Mar 10 15:10:31 crc kubenswrapper[4743]: W0310 15:10:31.706148 4743 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod675e18a7_71e0_455c_a7a8_a99e10cc6576.slice/crio-a0e34e3b148c44d6af512efc504f98d4e62710299cbb79ecdc3f598e14c5f878 WatchSource:0}: Error finding container a0e34e3b148c44d6af512efc504f98d4e62710299cbb79ecdc3f598e14c5f878: Status 404 returned error can't find the container with id a0e34e3b148c44d6af512efc504f98d4e62710299cbb79ecdc3f598e14c5f878 Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.710781 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7865496797-txnhr"] Mar 10 15:10:31 crc kubenswrapper[4743]: W0310 15:10:31.726160 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c9a24c6_acae_43c2_a1a9_9d1e7297daf9.slice/crio-85fcbd68c837db79ec22060bb992947753f2bdbb10b6a659515700748209197e WatchSource:0}: Error finding container 85fcbd68c837db79ec22060bb992947753f2bdbb10b6a659515700748209197e: Status 404 returned error can't find the container with id 85fcbd68c837db79ec22060bb992947753f2bdbb10b6a659515700748209197e Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.921780 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:10:31 crc kubenswrapper[4743]: I0310 15:10:31.923007 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1629308-2201-4101-b82b-e9c9e1f4903d" path="/var/lib/kubelet/pods/a1629308-2201-4101-b82b-e9c9e1f4903d/volumes" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053336 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-policies\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053475 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-router-certs\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053520 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-cliconfig\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053564 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-login\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053615 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-provider-selection\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053649 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-idp-0-file-data\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053685 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-serving-cert\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053734 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-trusted-ca-bundle\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053763 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-session\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053827 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rndjn\" (UniqueName: 
\"kubernetes.io/projected/9164077f-fddc-43e6-9aac-23a8be818d9f-kube-api-access-rndjn\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-ocp-branding-template\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.053948 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-error\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.054000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-service-ca\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.054077 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-dir\") pod \"9164077f-fddc-43e6-9aac-23a8be818d9f\" (UID: \"9164077f-fddc-43e6-9aac-23a8be818d9f\") " Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.054507 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: 
"9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.054544 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.055737 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.055916 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.055985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.061542 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.062052 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.062054 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9164077f-fddc-43e6-9aac-23a8be818d9f-kube-api-access-rndjn" (OuterVolumeSpecName: "kube-api-access-rndjn") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "kube-api-access-rndjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.062230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.063418 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.063785 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.064953 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.065573 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.066324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9164077f-fddc-43e6-9aac-23a8be818d9f" (UID: "9164077f-fddc-43e6-9aac-23a8be818d9f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155220 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155265 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155279 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155291 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155300 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155311 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155320 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rndjn\" (UniqueName: \"kubernetes.io/projected/9164077f-fddc-43e6-9aac-23a8be818d9f-kube-api-access-rndjn\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155328 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155338 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155348 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155358 4743 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155368 4743 reconciler_common.go:293] "Volume detached for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155378 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.155387 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9164077f-fddc-43e6-9aac-23a8be818d9f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.201481 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h4q2k"] Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.511046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" event={"ID":"675e18a7-71e0-455c-a7a8-a99e10cc6576","Type":"ContainerStarted","Data":"2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142"} Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.511118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" event={"ID":"675e18a7-71e0-455c-a7a8-a99e10cc6576","Type":"ContainerStarted","Data":"a0e34e3b148c44d6af512efc504f98d4e62710299cbb79ecdc3f598e14c5f878"} Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.512184 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.514627 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" event={"ID":"9164077f-fddc-43e6-9aac-23a8be818d9f","Type":"ContainerDied","Data":"4d2e9781747ecc9bf22a968b648fc31eb1dbd2e209ee5ab625974d241775daee"} Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.514703 4743 scope.go:117] "RemoveContainer" containerID="19d23ff6fa2796978a844d0c9165002eedc6b343d42db71f24adbca6f6ea9f3f" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.515113 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l2lf6" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.516925 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" event={"ID":"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9","Type":"ContainerStarted","Data":"41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851"} Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.517032 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" event={"ID":"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9","Type":"ContainerStarted","Data":"85fcbd68c837db79ec22060bb992947753f2bdbb10b6a659515700748209197e"} Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.517530 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.526141 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.541980 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" podStartSLOduration=4.54193622 podStartE2EDuration="4.54193622s" 
podCreationTimestamp="2026-03-10 15:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:32.540212618 +0000 UTC m=+297.247027376" watchObservedRunningTime="2026-03-10 15:10:32.54193622 +0000 UTC m=+297.248750968" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.554164 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2lf6"] Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.562861 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2lf6"] Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.620487 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:32 crc kubenswrapper[4743]: I0310 15:10:32.646322 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" podStartSLOduration=4.646287324 podStartE2EDuration="4.646287324s" podCreationTimestamp="2026-03-10 15:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:32.603186803 +0000 UTC m=+297.310001551" watchObservedRunningTime="2026-03-10 15:10:32.646287324 +0000 UTC m=+297.353102072" Mar 10 15:10:33 crc kubenswrapper[4743]: I0310 15:10:33.201890 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mkgdp"] Mar 10 15:10:33 crc kubenswrapper[4743]: I0310 15:10:33.524110 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h4q2k" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" containerName="registry-server" 
containerID="cri-o://b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274" gracePeriod=2 Mar 10 15:10:33 crc kubenswrapper[4743]: I0310 15:10:33.524360 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mkgdp" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerName="registry-server" containerID="cri-o://fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f" gracePeriod=2 Mar 10 15:10:33 crc kubenswrapper[4743]: I0310 15:10:33.923327 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9164077f-fddc-43e6-9aac-23a8be818d9f" path="/var/lib/kubelet/pods/9164077f-fddc-43e6-9aac-23a8be818d9f/volumes" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.051061 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.056566 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.101300 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.139495 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.183296 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-catalog-content\") pod \"ec21919b-a512-42f8-b1ce-80498821cb65\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.183378 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-utilities\") pod \"ec21919b-a512-42f8-b1ce-80498821cb65\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.183432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9z62\" (UniqueName: \"kubernetes.io/projected/2b8d4a30-a71d-48ca-b702-772f0e08c566-kube-api-access-s9z62\") pod \"2b8d4a30-a71d-48ca-b702-772f0e08c566\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.183477 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq52p\" (UniqueName: \"kubernetes.io/projected/ec21919b-a512-42f8-b1ce-80498821cb65-kube-api-access-lq52p\") pod \"ec21919b-a512-42f8-b1ce-80498821cb65\" (UID: \"ec21919b-a512-42f8-b1ce-80498821cb65\") " Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.183537 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-utilities\") pod \"2b8d4a30-a71d-48ca-b702-772f0e08c566\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.183658 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-catalog-content\") pod \"2b8d4a30-a71d-48ca-b702-772f0e08c566\" (UID: \"2b8d4a30-a71d-48ca-b702-772f0e08c566\") " Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.184568 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-utilities" (OuterVolumeSpecName: "utilities") pod "2b8d4a30-a71d-48ca-b702-772f0e08c566" (UID: "2b8d4a30-a71d-48ca-b702-772f0e08c566"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.184951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-utilities" (OuterVolumeSpecName: "utilities") pod "ec21919b-a512-42f8-b1ce-80498821cb65" (UID: "ec21919b-a512-42f8-b1ce-80498821cb65"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.185804 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.185846 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.190991 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8d4a30-a71d-48ca-b702-772f0e08c566-kube-api-access-s9z62" (OuterVolumeSpecName: "kube-api-access-s9z62") pod "2b8d4a30-a71d-48ca-b702-772f0e08c566" (UID: "2b8d4a30-a71d-48ca-b702-772f0e08c566"). InnerVolumeSpecName "kube-api-access-s9z62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.191220 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec21919b-a512-42f8-b1ce-80498821cb65-kube-api-access-lq52p" (OuterVolumeSpecName: "kube-api-access-lq52p") pod "ec21919b-a512-42f8-b1ce-80498821cb65" (UID: "ec21919b-a512-42f8-b1ce-80498821cb65"). InnerVolumeSpecName "kube-api-access-lq52p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.241265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b8d4a30-a71d-48ca-b702-772f0e08c566" (UID: "2b8d4a30-a71d-48ca-b702-772f0e08c566"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.241694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec21919b-a512-42f8-b1ce-80498821cb65" (UID: "ec21919b-a512-42f8-b1ce-80498821cb65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.287978 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8d4a30-a71d-48ca-b702-772f0e08c566-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.288022 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec21919b-a512-42f8-b1ce-80498821cb65-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.288038 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9z62\" (UniqueName: \"kubernetes.io/projected/2b8d4a30-a71d-48ca-b702-772f0e08c566-kube-api-access-s9z62\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.288055 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq52p\" (UniqueName: \"kubernetes.io/projected/ec21919b-a512-42f8-b1ce-80498821cb65-kube-api-access-lq52p\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.423156 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.423231 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:10:34 crc 
kubenswrapper[4743]: I0310 15:10:34.473225 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.534422 4743 generic.go:334] "Generic (PLEG): container finished" podID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerID="fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f" exitCode=0 Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.534534 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mkgdp" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.534542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkgdp" event={"ID":"2b8d4a30-a71d-48ca-b702-772f0e08c566","Type":"ContainerDied","Data":"fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f"} Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.534614 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkgdp" event={"ID":"2b8d4a30-a71d-48ca-b702-772f0e08c566","Type":"ContainerDied","Data":"bdfd5d5bc6275aeb664301305fe9c2c9028696c83bcc232ec5e428a200a13a82"} Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.534642 4743 scope.go:117] "RemoveContainer" containerID="fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.537641 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec21919b-a512-42f8-b1ce-80498821cb65" containerID="b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274" exitCode=0 Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.538545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4q2k" 
event={"ID":"ec21919b-a512-42f8-b1ce-80498821cb65","Type":"ContainerDied","Data":"b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274"} Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.538966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4q2k" event={"ID":"ec21919b-a512-42f8-b1ce-80498821cb65","Type":"ContainerDied","Data":"5a64fccb54b52cc75f6bc55d154d7dcc61cd38cf86ec49a9b953c4d9f21e89b5"} Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.538608 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4q2k" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.553639 4743 scope.go:117] "RemoveContainer" containerID="4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.567792 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mkgdp"] Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.570690 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mkgdp"] Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.585751 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h4q2k"] Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.589369 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h4q2k"] Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.593690 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.593761 4743 scope.go:117] "RemoveContainer" containerID="77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.618024 4743 scope.go:117] 
"RemoveContainer" containerID="fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f" Mar 10 15:10:34 crc kubenswrapper[4743]: E0310 15:10:34.620227 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f\": container with ID starting with fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f not found: ID does not exist" containerID="fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.620265 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f"} err="failed to get container status \"fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f\": rpc error: code = NotFound desc = could not find container \"fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f\": container with ID starting with fb5a1a17824583d379f9530979ac6f5254871805898fa58a8834e6df367bf90f not found: ID does not exist" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.620288 4743 scope.go:117] "RemoveContainer" containerID="4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1" Mar 10 15:10:34 crc kubenswrapper[4743]: E0310 15:10:34.621053 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1\": container with ID starting with 4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1 not found: ID does not exist" containerID="4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.621080 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1"} err="failed to get container status \"4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1\": rpc error: code = NotFound desc = could not find container \"4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1\": container with ID starting with 4d8b9f871c7955265bc5ccf5818d2f9d3610e27cfa79d8cc40526bd75d5a6ae1 not found: ID does not exist" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.621104 4743 scope.go:117] "RemoveContainer" containerID="77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480" Mar 10 15:10:34 crc kubenswrapper[4743]: E0310 15:10:34.622396 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480\": container with ID starting with 77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480 not found: ID does not exist" containerID="77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.622524 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480"} err="failed to get container status \"77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480\": rpc error: code = NotFound desc = could not find container \"77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480\": container with ID starting with 77d02213b1cc93aab52969cbeb75784aa9556255d1994b9b4aeab52af7bea480 not found: ID does not exist" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.622560 4743 scope.go:117] "RemoveContainer" containerID="b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.635494 4743 scope.go:117] "RemoveContainer" 
containerID="3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.650316 4743 scope.go:117] "RemoveContainer" containerID="b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.663682 4743 scope.go:117] "RemoveContainer" containerID="b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274" Mar 10 15:10:34 crc kubenswrapper[4743]: E0310 15:10:34.664180 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274\": container with ID starting with b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274 not found: ID does not exist" containerID="b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.664228 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274"} err="failed to get container status \"b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274\": rpc error: code = NotFound desc = could not find container \"b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274\": container with ID starting with b01a35ad44d265c9c56a80561824c2a125a027eaa144b62a4a9e00da7e526274 not found: ID does not exist" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.664266 4743 scope.go:117] "RemoveContainer" containerID="3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f" Mar 10 15:10:34 crc kubenswrapper[4743]: E0310 15:10:34.664685 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f\": container with ID starting with 
3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f not found: ID does not exist" containerID="3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.664748 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f"} err="failed to get container status \"3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f\": rpc error: code = NotFound desc = could not find container \"3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f\": container with ID starting with 3969652511ac5c4eb8cc7c5b9babfb901347f86f41ced8dc589eb0f66735845f not found: ID does not exist" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.664792 4743 scope.go:117] "RemoveContainer" containerID="b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e" Mar 10 15:10:34 crc kubenswrapper[4743]: E0310 15:10:34.665231 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e\": container with ID starting with b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e not found: ID does not exist" containerID="b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e" Mar 10 15:10:34 crc kubenswrapper[4743]: I0310 15:10:34.665293 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e"} err="failed to get container status \"b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e\": rpc error: code = NotFound desc = could not find container \"b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e\": container with ID starting with b05e389997675f424bc4e822360c09eafb812604620873e8303640385b7d894e not found: ID does not 
exist" Mar 10 15:10:35 crc kubenswrapper[4743]: I0310 15:10:35.925475 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" path="/var/lib/kubelet/pods/2b8d4a30-a71d-48ca-b702-772f0e08c566/volumes" Mar 10 15:10:35 crc kubenswrapper[4743]: I0310 15:10:35.927048 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" path="/var/lib/kubelet/pods/ec21919b-a512-42f8-b1ce-80498821cb65/volumes" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.020487 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-76544b44f9-mccm9"] Mar 10 15:10:37 crc kubenswrapper[4743]: E0310 15:10:37.020742 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" containerName="extract-content" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.020755 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" containerName="extract-content" Mar 10 15:10:37 crc kubenswrapper[4743]: E0310 15:10:37.020769 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerName="extract-content" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.020777 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerName="extract-content" Mar 10 15:10:37 crc kubenswrapper[4743]: E0310 15:10:37.020788 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9164077f-fddc-43e6-9aac-23a8be818d9f" containerName="oauth-openshift" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.020795 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9164077f-fddc-43e6-9aac-23a8be818d9f" containerName="oauth-openshift" Mar 10 15:10:37 crc kubenswrapper[4743]: E0310 15:10:37.020801 4743 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.020829 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4743]: E0310 15:10:37.020840 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerName="extract-utilities" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.020848 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerName="extract-utilities" Mar 10 15:10:37 crc kubenswrapper[4743]: E0310 15:10:37.020861 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" containerName="extract-utilities" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.020867 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" containerName="extract-utilities" Mar 10 15:10:37 crc kubenswrapper[4743]: E0310 15:10:37.020883 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.020889 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.021000 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec21919b-a512-42f8-b1ce-80498821cb65" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.021018 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9164077f-fddc-43e6-9aac-23a8be818d9f" containerName="oauth-openshift" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.021031 4743 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2b8d4a30-a71d-48ca-b702-772f0e08c566" containerName="registry-server" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.021509 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.024397 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.025602 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.025988 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.026196 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.026400 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.026489 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.026630 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.029424 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.030242 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 
15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.031260 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.031418 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.032852 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.034238 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038068 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-session\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038131 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038159 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038233 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-router-certs\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038282 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-error\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-audit-policies\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-service-ca\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038903 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qk8m\" (UniqueName: \"kubernetes.io/projected/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-kube-api-access-9qk8m\") pod 
\"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.038987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.039058 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-login\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.039137 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-audit-dir\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.062832 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76544b44f9-mccm9"] Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.109672 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141197 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-session\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-router-certs\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141326 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-error\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141370 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-audit-policies\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: 
\"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-service-ca\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141486 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141518 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141566 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141588 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qk8m\" (UniqueName: \"kubernetes.io/projected/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-kube-api-access-9qk8m\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-login\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141693 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-audit-dir\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: 
\"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.141787 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-audit-dir\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.142503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-audit-policies\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.142549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.142572 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-service-ca\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.148054 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-login\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.148220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-session\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.148225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.148894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-error\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.149550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-router-certs\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 
15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.150862 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.158106 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.159747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.161418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.163005 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qk8m\" (UniqueName: 
\"kubernetes.io/projected/0eba3ac6-837a-4530-bd0f-cd21ac11faaf-kube-api-access-9qk8m\") pod \"oauth-openshift-76544b44f9-mccm9\" (UID: \"0eba3ac6-837a-4530-bd0f-cd21ac11faaf\") " pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.360591 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.603417 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cv2mc"] Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.603879 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cv2mc" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerName="registry-server" containerID="cri-o://49f1f5f54716a4e436784ed8625e48acc50e3bf824860b239c314f1014ec66fc" gracePeriod=2 Mar 10 15:10:37 crc kubenswrapper[4743]: I0310 15:10:37.822705 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76544b44f9-mccm9"] Mar 10 15:10:38 crc kubenswrapper[4743]: I0310 15:10:38.565715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" event={"ID":"0eba3ac6-837a-4530-bd0f-cd21ac11faaf","Type":"ContainerStarted","Data":"a28f79818ffd2720ef5d5f2152070cdafee340cb2b839707205a5eab53e7a37c"} Mar 10 15:10:39 crc kubenswrapper[4743]: I0310 15:10:39.577414 4743 generic.go:334] "Generic (PLEG): container finished" podID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerID="49f1f5f54716a4e436784ed8625e48acc50e3bf824860b239c314f1014ec66fc" exitCode=0 Mar 10 15:10:39 crc kubenswrapper[4743]: I0310 15:10:39.577523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv2mc" 
event={"ID":"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493","Type":"ContainerDied","Data":"49f1f5f54716a4e436784ed8625e48acc50e3bf824860b239c314f1014ec66fc"} Mar 10 15:10:39 crc kubenswrapper[4743]: I0310 15:10:39.580134 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" event={"ID":"0eba3ac6-837a-4530-bd0f-cd21ac11faaf","Type":"ContainerStarted","Data":"9a6b64d0c375e06401d478f89fe6acf01f391e76b11965988f410f27d5f35ecd"} Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.532382 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.591545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv2mc" event={"ID":"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493","Type":"ContainerDied","Data":"763a304f28c802381bad3b010491352be3204447a7871ea4905bf223c4c2d83c"} Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.591607 4743 scope.go:117] "RemoveContainer" containerID="49f1f5f54716a4e436784ed8625e48acc50e3bf824860b239c314f1014ec66fc" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.591747 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cv2mc" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.592331 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.604174 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-utilities\") pod \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.604259 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzxhn\" (UniqueName: \"kubernetes.io/projected/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-kube-api-access-mzxhn\") pod \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.604300 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-catalog-content\") pod \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\" (UID: \"5faf25f9-ab4b-468e-b1e9-71a0d8e9e493\") " Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.605169 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.606290 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-utilities" (OuterVolumeSpecName: "utilities") pod "5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" (UID: "5faf25f9-ab4b-468e-b1e9-71a0d8e9e493"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.616615 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-76544b44f9-mccm9" podStartSLOduration=35.616553755 podStartE2EDuration="35.616553755s" podCreationTimestamp="2026-03-10 15:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:40.613465633 +0000 UTC m=+305.320280391" watchObservedRunningTime="2026-03-10 15:10:40.616553755 +0000 UTC m=+305.323368513" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.623244 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-kube-api-access-mzxhn" (OuterVolumeSpecName: "kube-api-access-mzxhn") pod "5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" (UID: "5faf25f9-ab4b-468e-b1e9-71a0d8e9e493"). InnerVolumeSpecName "kube-api-access-mzxhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.625220 4743 scope.go:117] "RemoveContainer" containerID="9bfdba87ca7e76147a4044cc46eb7fe6b766a43bee86cd91e75edfb2c32f18d6" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.644009 4743 scope.go:117] "RemoveContainer" containerID="409b90acd45605582c37c1e315cffbf440b459ef5a59c628e3f1c62dffd5ebc7" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.709633 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.714095 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzxhn\" (UniqueName: \"kubernetes.io/projected/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-kube-api-access-mzxhn\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.874076 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" (UID: "5faf25f9-ab4b-468e-b1e9-71a0d8e9e493"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.918218 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.920859 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cv2mc"] Mar 10 15:10:40 crc kubenswrapper[4743]: I0310 15:10:40.925746 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cv2mc"] Mar 10 15:10:41 crc kubenswrapper[4743]: I0310 15:10:41.253049 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:10:41 crc kubenswrapper[4743]: I0310 15:10:41.253122 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:10:41 crc kubenswrapper[4743]: I0310 15:10:41.253170 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:10:41 crc kubenswrapper[4743]: I0310 15:10:41.253795 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:10:41 crc kubenswrapper[4743]: I0310 15:10:41.253872 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2" gracePeriod=600 Mar 10 15:10:41 crc kubenswrapper[4743]: I0310 15:10:41.945270 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" path="/var/lib/kubelet/pods/5faf25f9-ab4b-468e-b1e9-71a0d8e9e493/volumes" Mar 10 15:10:42 crc kubenswrapper[4743]: I0310 15:10:42.608443 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2" exitCode=0 Mar 10 15:10:42 crc kubenswrapper[4743]: I0310 15:10:42.608569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2"} Mar 10 15:10:42 crc kubenswrapper[4743]: I0310 15:10:42.609017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"03018d7637364bc245db63b69660e40a53f14d3f7016873be1f03ec6299ce4d5"} Mar 10 15:10:48 crc kubenswrapper[4743]: I0310 15:10:48.584744 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7865496797-txnhr"] Mar 10 15:10:48 crc kubenswrapper[4743]: I0310 15:10:48.585754 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-7865496797-txnhr" podUID="4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" containerName="controller-manager" containerID="cri-o://41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851" gracePeriod=30 Mar 10 15:10:48 crc kubenswrapper[4743]: I0310 15:10:48.686079 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b"] Mar 10 15:10:48 crc kubenswrapper[4743]: I0310 15:10:48.686645 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" podUID="675e18a7-71e0-455c-a7a8-a99e10cc6576" containerName="route-controller-manager" containerID="cri-o://2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142" gracePeriod=30 Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.172646 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.179239 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241351 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675e18a7-71e0-455c-a7a8-a99e10cc6576-serving-cert\") pod \"675e18a7-71e0-455c-a7a8-a99e10cc6576\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241440 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-config\") pod \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241483 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-proxy-ca-bundles\") pod \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241526 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4xww\" (UniqueName: \"kubernetes.io/projected/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-kube-api-access-m4xww\") pod \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241546 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5xbv\" (UniqueName: \"kubernetes.io/projected/675e18a7-71e0-455c-a7a8-a99e10cc6576-kube-api-access-c5xbv\") pod \"675e18a7-71e0-455c-a7a8-a99e10cc6576\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241596 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-client-ca\") pod \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-serving-cert\") pod \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\" (UID: \"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241684 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-client-ca\") pod \"675e18a7-71e0-455c-a7a8-a99e10cc6576\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.241698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-config\") pod \"675e18a7-71e0-455c-a7a8-a99e10cc6576\" (UID: \"675e18a7-71e0-455c-a7a8-a99e10cc6576\") " Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.242995 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-config" (OuterVolumeSpecName: "config") pod "675e18a7-71e0-455c-a7a8-a99e10cc6576" (UID: "675e18a7-71e0-455c-a7a8-a99e10cc6576"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.255979 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" (UID: "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.256445 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" (UID: "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.256575 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-config" (OuterVolumeSpecName: "config") pod "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" (UID: "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.257010 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-client-ca" (OuterVolumeSpecName: "client-ca") pod "675e18a7-71e0-455c-a7a8-a99e10cc6576" (UID: "675e18a7-71e0-455c-a7a8-a99e10cc6576"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.259335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-kube-api-access-m4xww" (OuterVolumeSpecName: "kube-api-access-m4xww") pod "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" (UID: "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9"). InnerVolumeSpecName "kube-api-access-m4xww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.259884 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675e18a7-71e0-455c-a7a8-a99e10cc6576-kube-api-access-c5xbv" (OuterVolumeSpecName: "kube-api-access-c5xbv") pod "675e18a7-71e0-455c-a7a8-a99e10cc6576" (UID: "675e18a7-71e0-455c-a7a8-a99e10cc6576"). InnerVolumeSpecName "kube-api-access-c5xbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.260260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/675e18a7-71e0-455c-a7a8-a99e10cc6576-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "675e18a7-71e0-455c-a7a8-a99e10cc6576" (UID: "675e18a7-71e0-455c-a7a8-a99e10cc6576"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.260954 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" (UID: "4c9a24c6-acae-43c2-a1a9-9d1e7297daf9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343748 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343788 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343798 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/675e18a7-71e0-455c-a7a8-a99e10cc6576-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343823 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675e18a7-71e0-455c-a7a8-a99e10cc6576-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343833 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343843 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343856 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4xww\" (UniqueName: \"kubernetes.io/projected/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-kube-api-access-m4xww\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343866 4743 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-c5xbv\" (UniqueName: \"kubernetes.io/projected/675e18a7-71e0-455c-a7a8-a99e10cc6576-kube-api-access-c5xbv\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.343875 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.651230 4743 generic.go:334] "Generic (PLEG): container finished" podID="4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" containerID="41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851" exitCode=0 Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.651364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" event={"ID":"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9","Type":"ContainerDied","Data":"41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851"} Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.651455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" event={"ID":"4c9a24c6-acae-43c2-a1a9-9d1e7297daf9","Type":"ContainerDied","Data":"85fcbd68c837db79ec22060bb992947753f2bdbb10b6a659515700748209197e"} Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.651515 4743 scope.go:117] "RemoveContainer" containerID="41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.651329 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7865496797-txnhr" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.653379 4743 generic.go:334] "Generic (PLEG): container finished" podID="675e18a7-71e0-455c-a7a8-a99e10cc6576" containerID="2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142" exitCode=0 Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.653432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" event={"ID":"675e18a7-71e0-455c-a7a8-a99e10cc6576","Type":"ContainerDied","Data":"2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142"} Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.653454 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.653471 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b" event={"ID":"675e18a7-71e0-455c-a7a8-a99e10cc6576","Type":"ContainerDied","Data":"a0e34e3b148c44d6af512efc504f98d4e62710299cbb79ecdc3f598e14c5f878"} Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.679500 4743 scope.go:117] "RemoveContainer" containerID="41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851" Mar 10 15:10:49 crc kubenswrapper[4743]: E0310 15:10:49.685036 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851\": container with ID starting with 41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851 not found: ID does not exist" containerID="41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.685097 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851"} err="failed to get container status \"41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851\": rpc error: code = NotFound desc = could not find container \"41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851\": container with ID starting with 41501df1712bac8f0c5b86d5454351564a0bdf091604aa50b9e12fa44de9a851 not found: ID does not exist" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.685129 4743 scope.go:117] "RemoveContainer" containerID="2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.710130 4743 scope.go:117] "RemoveContainer" containerID="2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.714891 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b"] Mar 10 15:10:49 crc kubenswrapper[4743]: E0310 15:10:49.716066 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142\": container with ID starting with 2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142 not found: ID does not exist" containerID="2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.716143 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142"} err="failed to get container status \"2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142\": rpc error: code = NotFound desc = could not find container 
\"2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142\": container with ID starting with 2179332ed9a53425164c3ae8452d253a0290bde461a0c80e82d5825b58a26142 not found: ID does not exist" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.717767 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588c99c9b4-f8k8b"] Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.751112 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7865496797-txnhr"] Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.754041 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7865496797-txnhr"] Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.922435 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" path="/var/lib/kubelet/pods/4c9a24c6-acae-43c2-a1a9-9d1e7297daf9/volumes" Mar 10 15:10:49 crc kubenswrapper[4743]: I0310 15:10:49.923039 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675e18a7-71e0-455c-a7a8-a99e10cc6576" path="/var/lib/kubelet/pods/675e18a7-71e0-455c-a7a8-a99e10cc6576/volumes" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.029994 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66998f7cd-wm472"] Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.030326 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" containerName="controller-manager" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.030353 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" containerName="controller-manager" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.030370 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="675e18a7-71e0-455c-a7a8-a99e10cc6576" containerName="route-controller-manager" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.030380 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="675e18a7-71e0-455c-a7a8-a99e10cc6576" containerName="route-controller-manager" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.030401 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerName="extract-utilities" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.030410 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerName="extract-utilities" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.030426 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerName="registry-server" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.030434 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerName="registry-server" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.030446 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerName="extract-content" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.030455 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerName="extract-content" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.030904 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9a24c6-acae-43c2-a1a9-9d1e7297daf9" containerName="controller-manager" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.030923 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5faf25f9-ab4b-468e-b1e9-71a0d8e9e493" containerName="registry-server" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.030934 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="675e18a7-71e0-455c-a7a8-a99e10cc6576" containerName="route-controller-manager" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.031527 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.034290 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.034290 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.034549 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.034663 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.034924 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.035775 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.036645 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x"] Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.038709 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.040394 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.048185 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.048433 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.048832 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.048997 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.049237 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.049430 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.051576 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x"] Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.055104 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4365987-5520-4a78-94bb-0f2308b544f7-serving-cert\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " 
pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.055177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-proxy-ca-bundles\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.055211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-client-ca\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.055238 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn975\" (UniqueName: \"kubernetes.io/projected/c4365987-5520-4a78-94bb-0f2308b544f7-kube-api-access-nn975\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.055364 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-config\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.062456 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-66998f7cd-wm472"] Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156263 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-proxy-ca-bundles\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156323 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn975\" (UniqueName: \"kubernetes.io/projected/c4365987-5520-4a78-94bb-0f2308b544f7-kube-api-access-nn975\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-client-ca\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f10d973-c107-4a86-bd97-dac12b7dd7b0-client-ca\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-config\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4365987-5520-4a78-94bb-0f2308b544f7-serving-cert\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156535 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/4f10d973-c107-4a86-bd97-dac12b7dd7b0-kube-api-access-sjc7t\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f10d973-c107-4a86-bd97-dac12b7dd7b0-config\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.156581 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f10d973-c107-4a86-bd97-dac12b7dd7b0-serving-cert\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.157923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-config\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.157916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-proxy-ca-bundles\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.158470 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4365987-5520-4a78-94bb-0f2308b544f7-client-ca\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.163796 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4365987-5520-4a78-94bb-0f2308b544f7-serving-cert\") pod \"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.181700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn975\" (UniqueName: \"kubernetes.io/projected/c4365987-5520-4a78-94bb-0f2308b544f7-kube-api-access-nn975\") pod 
\"controller-manager-66998f7cd-wm472\" (UID: \"c4365987-5520-4a78-94bb-0f2308b544f7\") " pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.258343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/4f10d973-c107-4a86-bd97-dac12b7dd7b0-kube-api-access-sjc7t\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.258404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f10d973-c107-4a86-bd97-dac12b7dd7b0-config\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.258430 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f10d973-c107-4a86-bd97-dac12b7dd7b0-serving-cert\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.258497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f10d973-c107-4a86-bd97-dac12b7dd7b0-client-ca\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.259681 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f10d973-c107-4a86-bd97-dac12b7dd7b0-config\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.259985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f10d973-c107-4a86-bd97-dac12b7dd7b0-client-ca\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.262293 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f10d973-c107-4a86-bd97-dac12b7dd7b0-serving-cert\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.288407 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.288643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/4f10d973-c107-4a86-bd97-dac12b7dd7b0-kube-api-access-sjc7t\") pod \"route-controller-manager-5c4b8bd597-5pd2x\" (UID: \"4f10d973-c107-4a86-bd97-dac12b7dd7b0\") " pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.290196 4743 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:10:50 crc 
kubenswrapper[4743]: I0310 15:10:50.290376 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.290449 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291001 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287" gracePeriod=15 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291063 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f" gracePeriod=15 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291103 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed" gracePeriod=15 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291039 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c" gracePeriod=15 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291062 4743 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636" gracePeriod=15 Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291195 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291230 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291241 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291249 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291260 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291267 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291275 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291281 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291290 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291296 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291306 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291312 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291323 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291329 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291339 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291347 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291453 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291464 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc 
kubenswrapper[4743]: I0310 15:10:50.291477 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291484 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291492 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291502 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291515 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291606 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291615 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.291624 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291631 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291782 4743 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.291794 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.301424 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.357778 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.359248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.359278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.359308 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.359325 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.359348 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.359373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.359400 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.359420 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.366213 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.460742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.460821 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.460857 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.460890 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.460938 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.460959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461031 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461262 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461287 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.461336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.669144 4743 generic.go:334] "Generic (PLEG): container finished" podID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" containerID="05827689b49048499d1f79cee15a306e3ea075f4a14b58c5f917eb8326354106" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.669205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1","Type":"ContainerDied","Data":"05827689b49048499d1f79cee15a306e3ea075f4a14b58c5f917eb8326354106"} Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.671124 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.674969 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.677148 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 
15:10:50.678006 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.678035 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.678049 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.678065 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636" exitCode=2 Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.678123 4743 scope.go:117] "RemoveContainer" containerID="9ec6cbd8246d0051c00dcfd7706b71caa95048686f564b366b52c162053f7fba" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.814608 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.816013 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.816349 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: 
connect: connection refused" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.816686 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.817025 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:50 crc kubenswrapper[4743]: I0310 15:10:50.817058 4743 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 15:10:50 crc kubenswrapper[4743]: E0310 15:10:50.817380 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="200ms" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.004121 4743 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:10:51 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf" Netns:"/var/run/netns/ea0c0757-753f-468a-a467-2de9b175ab7a" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:51 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:51 crc kubenswrapper[4743]: > Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.004229 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:10:51 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" 
name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf" Netns:"/var/run/netns/ea0c0757-753f-468a-a467-2de9b175ab7a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:51 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:51 crc kubenswrapper[4743]: > pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.004278 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 15:10:51 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf" Netns:"/var/run/netns/ea0c0757-753f-468a-a467-2de9b175ab7a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:51 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:51 crc kubenswrapper[4743]: > 
pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.004368 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-66998f7cd-wm472_openshift-controller-manager(c4365987-5520-4a78-94bb-0f2308b544f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-66998f7cd-wm472_openshift-controller-manager(c4365987-5520-4a78-94bb-0f2308b544f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf\\\" Netns:\\\"/var/run/netns/ea0c0757-753f-468a-a467-2de9b175ab7a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get 
\\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s\\\": dial tcp 38.102.83.115:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" podUID="c4365987-5520-4a78-94bb-0f2308b544f7" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.005095 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.115:6443: connect: connection refused" event=< Mar 10 15:10:51 crc kubenswrapper[4743]: &Event{ObjectMeta:{controller-manager-66998f7cd-wm472.189b837cf4a8b2f6 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-66998f7cd-wm472,UID:c4365987-5520-4a78-94bb-0f2308b544f7,APIVersion:v1,ResourceVersion:30020,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" 
failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf" Netns:"/var/run/netns/ea0c0757-753f-468a-a467-2de9b175ab7a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:51 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:10:51.00430207 +0000 UTC m=+315.711116818,LastTimestamp:2026-03-10 15:10:51.00430207 +0000 UTC m=+315.711116818,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:10:51 crc kubenswrapper[4743]: > Mar 10 
15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.018336 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="400ms" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.073920 4743 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:10:51 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031" Netns:"/var/run/netns/67099d97-104c-49f1-995f-6c1691dece48" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:51 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:51 crc kubenswrapper[4743]: > Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.074035 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:10:51 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031" Netns:"/var/run/netns/67099d97-104c-49f1-995f-6c1691dece48" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: 
[openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:51 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:51 crc kubenswrapper[4743]: > pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.074069 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 15:10:51 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031" Netns:"/var/run/netns/67099d97-104c-49f1-995f-6c1691dece48" 
IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:51 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:51 crc kubenswrapper[4743]: > pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.074149 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager(4f10d973-c107-4a86-bd97-dac12b7dd7b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager(4f10d973-c107-4a86-bd97-dac12b7dd7b0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031\\\" Netns:\\\"/var/run/netns/67099d97-104c-49f1-995f-6c1691dece48\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=468f6a8f93d58cd492f3b02a78be611c9e2128cdce52436fa2d9a5cbcb932031;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s\\\": dial tcp 38.102.83.115:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" podUID="4f10d973-c107-4a86-bd97-dac12b7dd7b0" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.346432 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:10:51Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:10:51Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:10:51Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:10:51Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.347119 4743 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.347548 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.347986 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.348607 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.348638 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:10:51 crc kubenswrapper[4743]: E0310 15:10:51.420352 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="800ms" Mar 10 15:10:51 crc kubenswrapper[4743]: I0310 15:10:51.706609 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:10:51 crc kubenswrapper[4743]: I0310 15:10:51.708356 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:51 crc kubenswrapper[4743]: I0310 15:10:51.708429 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:51 crc kubenswrapper[4743]: I0310 15:10:51.709196 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:51 crc kubenswrapper[4743]: I0310 15:10:51.709357 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:51 crc kubenswrapper[4743]: I0310 15:10:51.975921 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:51 crc kubenswrapper[4743]: I0310 15:10:51.977261 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.088771 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kube-api-access\") pod \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.088860 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-var-lock\") pod \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\" (UID: 
\"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.088929 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kubelet-dir\") pod \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\" (UID: \"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1\") " Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.089046 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-var-lock" (OuterVolumeSpecName: "var-lock") pod "cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" (UID: "cbaf72e5-0721-41a6-957c-28ce7dbb7ff1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.089122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" (UID: "cbaf72e5-0721-41a6-957c-28ce7dbb7ff1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.089337 4743 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.089356 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.096924 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" (UID: "cbaf72e5-0721-41a6-957c-28ce7dbb7ff1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.191018 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbaf72e5-0721-41a6-957c-28ce7dbb7ff1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.221741 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="1.6s" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.395535 4743 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:10:52 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f" Netns:"/var/run/netns/6c0b216c-9cf7-43af-bc49-09b9b23aa9f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:52 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:52 crc kubenswrapper[4743]: > 
Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.397581 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:10:52 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f" Netns:"/var/run/netns/6c0b216c-9cf7-43af-bc49-09b9b23aa9f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:52 crc kubenswrapper[4743]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:52 crc kubenswrapper[4743]: > pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.397612 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 15:10:52 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f" Netns:"/var/run/netns/6c0b216c-9cf7-43af-bc49-09b9b23aa9f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod 
controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:52 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:52 crc kubenswrapper[4743]: > pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.397688 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-66998f7cd-wm472_openshift-controller-manager(c4365987-5520-4a78-94bb-0f2308b544f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-66998f7cd-wm472_openshift-controller-manager(c4365987-5520-4a78-94bb-0f2308b544f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f\\\" Netns:\\\"/var/run/netns/6c0b216c-9cf7-43af-bc49-09b9b23aa9f5\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=2e07cd9b53e25acc15a3385489b1ae0ebff0e31bb9b8240aeb9ab5621b18261f;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s\\\": dial tcp 38.102.83.115:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" podUID="c4365987-5520-4a78-94bb-0f2308b544f7" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.439282 4743 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:10:52 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e" Netns:"/var/run/netns/d5ecb222-a002-496e-95d2-9c295002196e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:52 crc kubenswrapper[4743]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:52 crc kubenswrapper[4743]: > Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.439384 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:10:52 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e" Netns:"/var/run/netns/d5ecb222-a002-496e-95d2-9c295002196e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod 
route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:52 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:52 crc kubenswrapper[4743]: > pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.439408 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 15:10:52 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e" Netns:"/var/run/netns/d5ecb222-a002-496e-95d2-9c295002196e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring 
pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:52 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:10:52 crc kubenswrapper[4743]: > pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.440102 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager(4f10d973-c107-4a86-bd97-dac12b7dd7b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager(4f10d973-c107-4a86-bd97-dac12b7dd7b0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e): 
error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e\\\" Netns:\\\"/var/run/netns/d5ecb222-a002-496e-95d2-9c295002196e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=0686676392dfceb04846f5c9ede5f29f4076d964f5d7c945d943369f5120fa1e;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s\\\": dial tcp 38.102.83.115:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" 
pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" podUID="4f10d973-c107-4a86-bd97-dac12b7dd7b0" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.647470 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.648590 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.649541 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.650130 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.699928 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.700048 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: 
"f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.700119 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.700184 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.700381 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.700486 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.701180 4743 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.701223 4743 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.701238 4743 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.715978 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cbaf72e5-0721-41a6-957c-28ce7dbb7ff1","Type":"ContainerDied","Data":"d3ea96daa2fea20a6ab7c493e127599308a06d5f398b80068e55e1c5993e016e"} Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.716022 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.716038 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3ea96daa2fea20a6ab7c493e127599308a06d5f398b80068e55e1c5993e016e" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.719174 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.720013 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed" exitCode=0 Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.720096 4743 scope.go:117] "RemoveContainer" containerID="5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.720206 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.754008 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.754671 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.755521 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.756065 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.758170 4743 scope.go:117] "RemoveContainer" containerID="c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.770775 4743 scope.go:117] "RemoveContainer" containerID="29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f" Mar 10 15:10:52 crc 
kubenswrapper[4743]: I0310 15:10:52.785317 4743 scope.go:117] "RemoveContainer" containerID="50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.800883 4743 scope.go:117] "RemoveContainer" containerID="f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.816524 4743 scope.go:117] "RemoveContainer" containerID="7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.837764 4743 scope.go:117] "RemoveContainer" containerID="5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.838431 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\": container with ID starting with 5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c not found: ID does not exist" containerID="5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.838468 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c"} err="failed to get container status \"5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\": rpc error: code = NotFound desc = could not find container \"5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c\": container with ID starting with 5746ae865a2e0c3d8b51ddef3fea2ef59034cb07ece654d176e445e961a9158c not found: ID does not exist" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.838502 4743 scope.go:117] "RemoveContainer" containerID="c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.838948 
4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\": container with ID starting with c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287 not found: ID does not exist" containerID="c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.838976 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287"} err="failed to get container status \"c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\": rpc error: code = NotFound desc = could not find container \"c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287\": container with ID starting with c3b0840f6304bd39e4c626639636d3fb0aa7f73fcd53f9300872afe2fa669287 not found: ID does not exist" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.839010 4743 scope.go:117] "RemoveContainer" containerID="29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.839343 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\": container with ID starting with 29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f not found: ID does not exist" containerID="29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.839373 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f"} err="failed to get container status \"29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\": rpc error: code = 
NotFound desc = could not find container \"29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f\": container with ID starting with 29c740de7f9496acbaac94595e8ae1b3dfd04ae083f3adead489175815e4195f not found: ID does not exist" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.839404 4743 scope.go:117] "RemoveContainer" containerID="50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.839693 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\": container with ID starting with 50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636 not found: ID does not exist" containerID="50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.839719 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636"} err="failed to get container status \"50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\": rpc error: code = NotFound desc = could not find container \"50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636\": container with ID starting with 50a0184b8174c980767c7c4b3d81f012deed553adbc13e99c1181af9a0d8e636 not found: ID does not exist" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.839736 4743 scope.go:117] "RemoveContainer" containerID="f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.839979 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\": container with ID starting with 
f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed not found: ID does not exist" containerID="f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.840000 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed"} err="failed to get container status \"f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\": rpc error: code = NotFound desc = could not find container \"f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed\": container with ID starting with f198e4a069595ab1ea70b51dc628f8135a5800b96dc87e83dea5899337aacfed not found: ID does not exist" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.840016 4743 scope.go:117] "RemoveContainer" containerID="7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f" Mar 10 15:10:52 crc kubenswrapper[4743]: E0310 15:10:52.840274 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\": container with ID starting with 7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f not found: ID does not exist" containerID="7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f" Mar 10 15:10:52 crc kubenswrapper[4743]: I0310 15:10:52.840299 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f"} err="failed to get container status \"7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\": rpc error: code = NotFound desc = could not find container \"7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f\": container with ID starting with 7ff38475a22f43246c423669078697e68a443efbf5e5e5998bfa8d3f52bf076f not found: ID does not 
exist" Mar 10 15:10:53 crc kubenswrapper[4743]: E0310 15:10:53.822585 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="3.2s" Mar 10 15:10:53 crc kubenswrapper[4743]: I0310 15:10:53.923032 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 15:10:55 crc kubenswrapper[4743]: E0310 15:10:55.342246 4743 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.115:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:55 crc kubenswrapper[4743]: I0310 15:10:55.343106 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:55 crc kubenswrapper[4743]: I0310 15:10:55.747863 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a"} Mar 10 15:10:55 crc kubenswrapper[4743]: I0310 15:10:55.748448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c15fd7cd747ba6e596272ed515f13338a0d835fdb9499b193d34cbb9e855cd4c"} Mar 10 15:10:55 crc kubenswrapper[4743]: I0310 15:10:55.749393 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:55 crc kubenswrapper[4743]: E0310 15:10:55.749481 4743 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.115:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:10:55 crc kubenswrapper[4743]: I0310 15:10:55.922865 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:10:57 crc kubenswrapper[4743]: E0310 15:10:57.025895 4743 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="6.4s" Mar 10 15:10:57 crc kubenswrapper[4743]: E0310 15:10:57.733677 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.115:6443: connect: connection refused" event=< Mar 10 15:10:57 crc kubenswrapper[4743]: &Event{ObjectMeta:{controller-manager-66998f7cd-wm472.189b837cf4a8b2f6 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-66998f7cd-wm472,UID:c4365987-5520-4a78-94bb-0f2308b544f7,APIVersion:v1,ResourceVersion:30020,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf" Netns:"/var/run/netns/ea0c0757-753f-468a-a467-2de9b175ab7a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=0a4c6133f9987ea794385a3c51362bc1b15db3d4c603c386df5ae818f26c5faf;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: 
[openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:57 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:10:51.00430207 +0000 UTC m=+315.711116818,LastTimestamp:2026-03-10 15:10:51.00430207 +0000 UTC m=+315.711116818,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:10:57 crc kubenswrapper[4743]: > Mar 10 15:10:59 crc kubenswrapper[4743]: I0310 15:10:59.205510 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:10:59 crc kubenswrapper[4743]: I0310 15:10:59.205642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:10:59 crc kubenswrapper[4743]: W0310 15:10:59.206301 4743 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:59 crc kubenswrapper[4743]: E0310 15:10:59.206390 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:10:59 crc kubenswrapper[4743]: W0310 15:10:59.206311 4743 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27263": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:59 crc kubenswrapper[4743]: E0310 15:10:59.206428 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27263\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:10:59 crc kubenswrapper[4743]: I0310 15:10:59.306391 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:10:59 crc kubenswrapper[4743]: I0310 15:10:59.306444 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:10:59 crc kubenswrapper[4743]: W0310 15:10:59.307130 4743 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:10:59 crc kubenswrapper[4743]: E0310 15:10:59.307221 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" 
Mar 10 15:11:00 crc kubenswrapper[4743]: E0310 15:11:00.206747 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:11:00 crc kubenswrapper[4743]: E0310 15:11:00.206833 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:00 crc kubenswrapper[4743]: E0310 15:11:00.206872 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:13:02.206847173 +0000 UTC m=+446.913661921 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:11:00 crc kubenswrapper[4743]: E0310 15:11:00.206931 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:13:02.206908045 +0000 UTC m=+446.913722793 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:00 crc kubenswrapper[4743]: E0310 15:11:00.307625 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:00 crc kubenswrapper[4743]: E0310 15:11:00.307667 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:00 crc kubenswrapper[4743]: W0310 15:11:00.308318 4743 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:00 crc kubenswrapper[4743]: E0310 15:11:00.308429 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:01 crc kubenswrapper[4743]: E0310 15:11:01.308693 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:01 crc kubenswrapper[4743]: E0310 15:11:01.308742 4743 projected.go:194] 
Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:01 crc kubenswrapper[4743]: E0310 15:11:01.308739 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:01 crc kubenswrapper[4743]: E0310 15:11:01.308789 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:01 crc kubenswrapper[4743]: E0310 15:11:01.308839 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:13:03.308799824 +0000 UTC m=+448.015614572 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:01 crc kubenswrapper[4743]: E0310 15:11:01.308903 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:13:03.308873016 +0000 UTC m=+448.015687794 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:11:01 crc kubenswrapper[4743]: W0310 15:11:01.592935 4743 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:01 crc kubenswrapper[4743]: E0310 15:11:01.593033 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:01 crc kubenswrapper[4743]: W0310 15:11:01.605488 4743 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27263": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:01 crc kubenswrapper[4743]: E0310 15:11:01.605583 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27263\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:02 crc kubenswrapper[4743]: W0310 15:11:02.115436 4743 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:02 crc kubenswrapper[4743]: E0310 15:11:02.115975 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:02 crc kubenswrapper[4743]: I0310 15:11:02.796886 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 15:11:02 crc kubenswrapper[4743]: I0310 15:11:02.797667 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 15:11:02 crc kubenswrapper[4743]: I0310 15:11:02.797735 4743 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3" exitCode=1 Mar 10 15:11:02 crc kubenswrapper[4743]: I0310 15:11:02.797780 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3"} Mar 10 15:11:02 crc kubenswrapper[4743]: I0310 15:11:02.798527 4743 scope.go:117] "RemoveContainer" containerID="74cc60f73735360080aab3817aa1cb9eabc6a14062a5e1a9d896fa7ad5d6bba3" Mar 10 15:11:02 crc kubenswrapper[4743]: I0310 15:11:02.798973 4743 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:02 crc kubenswrapper[4743]: I0310 15:11:02.799583 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:03 crc kubenswrapper[4743]: W0310 15:11:03.036716 4743 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:03 crc kubenswrapper[4743]: E0310 15:11:03.037303 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:03 crc kubenswrapper[4743]: I0310 15:11:03.146633 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:11:03 crc kubenswrapper[4743]: E0310 15:11:03.428451 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.115:6443: connect: connection refused" interval="7s" Mar 10 15:11:03 crc kubenswrapper[4743]: I0310 15:11:03.809108 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 15:11:03 crc kubenswrapper[4743]: I0310 15:11:03.809780 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 15:11:03 crc kubenswrapper[4743]: I0310 15:11:03.809879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"89abe24b550c274a4a1f9f3f5c07d15ed217480d76e49b36d43cada8d812d748"} Mar 10 15:11:03 crc kubenswrapper[4743]: I0310 15:11:03.810856 4743 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": 
dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:03 crc kubenswrapper[4743]: I0310 15:11:03.811524 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:04 crc kubenswrapper[4743]: I0310 15:11:04.914968 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:11:04 crc kubenswrapper[4743]: I0310 15:11:04.915415 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:11:05 crc kubenswrapper[4743]: W0310 15:11:05.335448 4743 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:05 crc kubenswrapper[4743]: E0310 15:11:05.335975 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:05 crc kubenswrapper[4743]: W0310 15:11:05.499563 4743 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:05 crc kubenswrapper[4743]: E0310 15:11:05.499756 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:05 crc kubenswrapper[4743]: E0310 15:11:05.529259 4743 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:11:05 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194" Netns:"/var/run/netns/fd158e4b-86fa-44bd-9b38-399312084306" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: 
[openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:05 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:05 crc kubenswrapper[4743]: > Mar 10 15:11:05 crc kubenswrapper[4743]: E0310 15:11:05.529342 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:11:05 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194" Netns:"/var/run/netns/fd158e4b-86fa-44bd-9b38-399312084306" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:05 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:05 crc kubenswrapper[4743]: > pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:11:05 crc kubenswrapper[4743]: E0310 15:11:05.529366 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 15:11:05 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194" Netns:"/var/run/netns/fd158e4b-86fa-44bd-9b38-399312084306" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:05 crc kubenswrapper[4743]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:05 crc kubenswrapper[4743]: > pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:11:05 crc kubenswrapper[4743]: E0310 15:11:05.529448 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager(4f10d973-c107-4a86-bd97-dac12b7dd7b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager(4f10d973-c107-4a86-bd97-dac12b7dd7b0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-5c4b8bd597-5pd2x_openshift-route-controller-manager_4f10d973-c107-4a86-bd97-dac12b7dd7b0_0(98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194): error adding pod openshift-route-controller-manager_route-controller-manager-5c4b8bd597-5pd2x to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194\\\" Netns:\\\"/var/run/netns/fd158e4b-86fa-44bd-9b38-399312084306\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-5c4b8bd597-5pd2x;K8S_POD_INFRA_CONTAINER_ID=98c75305297ae6a32eda126cab172a34b20c759bcf1c569e0be6bbe919b71194;K8S_POD_UID=4f10d973-c107-4a86-bd97-dac12b7dd7b0\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x] networking: Multus: [openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x/4f10d973-c107-4a86-bd97-dac12b7dd7b0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-5c4b8bd597-5pd2x in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c4b8bd597-5pd2x?timeout=1m0s\\\": dial tcp 38.102.83.115:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" podUID="4f10d973-c107-4a86-bd97-dac12b7dd7b0" Mar 10 15:11:05 crc kubenswrapper[4743]: W0310 15:11:05.876468 4743 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27263": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:05 crc kubenswrapper[4743]: E0310 15:11:05.876609 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list 
*v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27263\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.914769 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.921226 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.921700 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.923180 4743 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.924005 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.925176 4743 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.926254 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.998415 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:05 crc kubenswrapper[4743]: I0310 15:11:05.998459 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:05 crc kubenswrapper[4743]: E0310 15:11:05.999124 4743 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:06 crc kubenswrapper[4743]: I0310 15:11:06.000357 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:06 crc kubenswrapper[4743]: E0310 15:11:06.006003 4743 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.115:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" volumeName="registry-storage" Mar 10 15:11:06 crc kubenswrapper[4743]: W0310 15:11:06.039483 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-95431d6b84739218b164fe768f345c51e54bf9ab4864633778b2aabe8f3196ff WatchSource:0}: Error finding container 95431d6b84739218b164fe768f345c51e54bf9ab4864633778b2aabe8f3196ff: Status 404 returned error can't find the container with id 95431d6b84739218b164fe768f345c51e54bf9ab4864633778b2aabe8f3196ff Mar 10 15:11:06 crc kubenswrapper[4743]: E0310 15:11:06.573491 4743 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:11:06 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9" Netns:"/var/run/netns/4fa69a46-7b17-48c8-a744-3f030b90b1c7" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:06 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:06 crc kubenswrapper[4743]: > Mar 10 15:11:06 crc kubenswrapper[4743]: E0310 15:11:06.574080 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:11:06 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" 
name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9" Netns:"/var/run/netns/4fa69a46-7b17-48c8-a744-3f030b90b1c7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:06 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:06 crc kubenswrapper[4743]: > pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:11:06 crc kubenswrapper[4743]: E0310 15:11:06.574103 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 15:11:06 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9" Netns:"/var/run/netns/4fa69a46-7b17-48c8-a744-3f030b90b1c7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:06 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:06 crc kubenswrapper[4743]: > 
pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:11:06 crc kubenswrapper[4743]: E0310 15:11:06.574182 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-66998f7cd-wm472_openshift-controller-manager(c4365987-5520-4a78-94bb-0f2308b544f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-66998f7cd-wm472_openshift-controller-manager(c4365987-5520-4a78-94bb-0f2308b544f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-66998f7cd-wm472_openshift-controller-manager_c4365987-5520-4a78-94bb-0f2308b544f7_0(63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9): error adding pod openshift-controller-manager_controller-manager-66998f7cd-wm472 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9\\\" Netns:\\\"/var/run/netns/4fa69a46-7b17-48c8-a744-3f030b90b1c7\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-66998f7cd-wm472;K8S_POD_INFRA_CONTAINER_ID=63435b9c838e85bd79b4104ae91437144d991ae6b39b9edece683c1c3d5ecaa9;K8S_POD_UID=c4365987-5520-4a78-94bb-0f2308b544f7\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-66998f7cd-wm472] networking: Multus: [openshift-controller-manager/controller-manager-66998f7cd-wm472/c4365987-5520-4a78-94bb-0f2308b544f7]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-66998f7cd-wm472 in out of cluster comm: status update failed for pod /: Get 
\\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-66998f7cd-wm472?timeout=1m0s\\\": dial tcp 38.102.83.115:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" podUID="c4365987-5520-4a78-94bb-0f2308b544f7" Mar 10 15:11:06 crc kubenswrapper[4743]: I0310 15:11:06.828911 4743 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e47d7a0888ba6c832bb85e70749d4cc2acf949c730949795729e0a84dd2c9e8e" exitCode=0 Mar 10 15:11:06 crc kubenswrapper[4743]: I0310 15:11:06.829020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e47d7a0888ba6c832bb85e70749d4cc2acf949c730949795729e0a84dd2c9e8e"} Mar 10 15:11:06 crc kubenswrapper[4743]: I0310 15:11:06.829360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"95431d6b84739218b164fe768f345c51e54bf9ab4864633778b2aabe8f3196ff"} Mar 10 15:11:06 crc kubenswrapper[4743]: I0310 15:11:06.829639 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:06 crc kubenswrapper[4743]: I0310 15:11:06.829651 4743 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:06 crc kubenswrapper[4743]: E0310 15:11:06.830133 4743 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:06 crc kubenswrapper[4743]: I0310 15:11:06.830698 4743 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:06 crc kubenswrapper[4743]: I0310 15:11:06.831248 4743 status_manager.go:851] "Failed to get status for pod" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.115:6443: connect: connection refused" Mar 10 15:11:06 crc kubenswrapper[4743]: W0310 15:11:06.948732 4743 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27262": dial tcp 38.102.83.115:6443: connect: connection refused Mar 10 15:11:06 crc kubenswrapper[4743]: E0310 15:11:06.948863 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27262\": dial tcp 38.102.83.115:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:11:07 crc kubenswrapper[4743]: I0310 15:11:07.838988 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ee860a71d866a55f8025e57f61ae38bc1b1b1b3dc4b63e506a717ac1f3fe5920"} Mar 10 15:11:07 crc kubenswrapper[4743]: I0310 15:11:07.839050 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a985a91f17893d47b4624f64c9ef7bb211164e5c7e30262a0751f00c53a0f4b2"} Mar 10 15:11:07 crc kubenswrapper[4743]: I0310 15:11:07.839064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0279ac5cd2bf0d7893c4c6cd750afb3ce4148425475ae6f37af8c2604b29f867"} Mar 10 15:11:07 crc kubenswrapper[4743]: I0310 15:11:07.839077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1193ca63b1e6126ee3f99518639559e0fd9bd78d873c4b6a1d420c56b8202a93"} Mar 10 15:11:08 crc kubenswrapper[4743]: I0310 15:11:08.848515 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0ceffff97abce2a3b334b2ffc2a041471f383801e9bba21c7079358508a3ac40"} Mar 10 15:11:08 crc kubenswrapper[4743]: I0310 15:11:08.849071 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:08 crc kubenswrapper[4743]: I0310 15:11:08.848901 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:08 crc kubenswrapper[4743]: I0310 15:11:08.849114 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:09 crc kubenswrapper[4743]: I0310 15:11:09.747744 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:11:11 crc kubenswrapper[4743]: I0310 15:11:11.000933 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:11 crc kubenswrapper[4743]: I0310 15:11:11.001377 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:11 crc kubenswrapper[4743]: I0310 15:11:11.007875 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:12 crc kubenswrapper[4743]: I0310 15:11:12.729437 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 15:11:12 crc kubenswrapper[4743]: I0310 15:11:12.881581 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 15:11:13 crc kubenswrapper[4743]: I0310 15:11:13.146904 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:11:13 crc kubenswrapper[4743]: I0310 15:11:13.150969 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:11:13 crc kubenswrapper[4743]: I0310 15:11:13.866071 4743 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:13 crc kubenswrapper[4743]: I0310 15:11:13.889248 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:11:14 crc kubenswrapper[4743]: I0310 15:11:14.675196 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 15:11:14 crc kubenswrapper[4743]: I0310 15:11:14.887005 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:14 crc kubenswrapper[4743]: I0310 15:11:14.887972 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:14 crc kubenswrapper[4743]: I0310 15:11:14.893620 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:15 crc kubenswrapper[4743]: I0310 15:11:15.893309 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:15 crc kubenswrapper[4743]: I0310 15:11:15.893349 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f090627-57d4-471b-810a-540b21da2e8a" Mar 10 15:11:15 crc kubenswrapper[4743]: E0310 15:11:15.934906 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" 
podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:11:15 crc kubenswrapper[4743]: I0310 15:11:15.939942 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ddc7cd5f-2b8f-48b4-944a-608279cbb6a5" Mar 10 15:11:15 crc kubenswrapper[4743]: E0310 15:11:15.943361 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:11:15 crc kubenswrapper[4743]: E0310 15:11:15.952528 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:11:16 crc kubenswrapper[4743]: I0310 15:11:16.914485 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:11:16 crc kubenswrapper[4743]: I0310 15:11:16.915456 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:11:16 crc kubenswrapper[4743]: I0310 15:11:16.994248 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 15:11:17 crc kubenswrapper[4743]: I0310 15:11:17.912395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" event={"ID":"c4365987-5520-4a78-94bb-0f2308b544f7","Type":"ContainerStarted","Data":"52d9a132f00016868e4b7dc20fbbfda05ad6f8dc919fad3d9b515d1ee04073b9"} Mar 10 15:11:17 crc kubenswrapper[4743]: I0310 15:11:17.912953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" event={"ID":"c4365987-5520-4a78-94bb-0f2308b544f7","Type":"ContainerStarted","Data":"a171aa56b06158ef0083aca8f89787947c9067d0b658e7e3dc68586bd42a6ced"} Mar 10 15:11:17 crc kubenswrapper[4743]: I0310 15:11:17.913655 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:11:17 crc kubenswrapper[4743]: I0310 15:11:17.914942 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:11:17 crc kubenswrapper[4743]: I0310 15:11:17.915303 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:11:17 crc kubenswrapper[4743]: I0310 15:11:17.923992 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" Mar 10 15:11:18 crc kubenswrapper[4743]: W0310 15:11:18.224751 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f10d973_c107_4a86_bd97_dac12b7dd7b0.slice/crio-4f3f6a5a19891e89e994470b5b9d32971b474de7466697a8ed4dcfb4393bda7d WatchSource:0}: Error finding container 4f3f6a5a19891e89e994470b5b9d32971b474de7466697a8ed4dcfb4393bda7d: Status 404 returned error can't find the container with id 4f3f6a5a19891e89e994470b5b9d32971b474de7466697a8ed4dcfb4393bda7d Mar 10 15:11:18 crc kubenswrapper[4743]: I0310 15:11:18.919333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" event={"ID":"4f10d973-c107-4a86-bd97-dac12b7dd7b0","Type":"ContainerStarted","Data":"859a6ee116816cd8ebaa6379682bcf05a9642730f834bdabafdf0235f3f62a97"} Mar 10 15:11:18 crc kubenswrapper[4743]: I0310 15:11:18.919385 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" event={"ID":"4f10d973-c107-4a86-bd97-dac12b7dd7b0","Type":"ContainerStarted","Data":"4f3f6a5a19891e89e994470b5b9d32971b474de7466697a8ed4dcfb4393bda7d"} Mar 10 15:11:20 crc kubenswrapper[4743]: I0310 15:11:20.367212 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:11:21 crc kubenswrapper[4743]: I0310 15:11:21.367183 4743 patch_prober.go:28] interesting pod/route-controller-manager-5c4b8bd597-5pd2x container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:11:21 crc kubenswrapper[4743]: I0310 15:11:21.367297 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" podUID="4f10d973-c107-4a86-bd97-dac12b7dd7b0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:11:22 crc kubenswrapper[4743]: I0310 15:11:22.367686 4743 patch_prober.go:28] interesting pod/route-controller-manager-5c4b8bd597-5pd2x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:11:22 crc kubenswrapper[4743]: I0310 15:11:22.367785 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" podUID="4f10d973-c107-4a86-bd97-dac12b7dd7b0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.011959 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.528582 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 
15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.528980 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.529165 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.529343 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.529483 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.571459 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.627398 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.677284 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.726791 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.846910 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 15:11:25 crc kubenswrapper[4743]: I0310 15:11:25.922799 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.036070 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.044930 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.053239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.080468 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.121703 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.175633 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.392377 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.600141 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.705034 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.880622 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 10 15:11:26 crc kubenswrapper[4743]: I0310 15:11:26.914464 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.038742 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.092715 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.112560 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.145701 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.310299 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.311139 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.314949 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.399006 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.414026 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.455861 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.484760 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.492550 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.511976 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.564143 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.586180 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.713313 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.783754 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.792418 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.802766 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.914634 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.916985 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.960672 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.974907 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 15:11:27 crc kubenswrapper[4743]: I0310 15:11:27.983724 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.048885 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.091873 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.096202 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.117244 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.135641 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.175894 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.251160 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.270543 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.272442 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.407709 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.417253 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.439850 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 10 15:11:28 crc kubenswrapper[4743]: I0310 15:11:28.868438 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.036157 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.123174 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.174526 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.183295 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.222274 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.249057 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.285797 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.290865 4743 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.339647 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.340241 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.389500 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.507428 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.518996 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.582241 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.627445 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.689931 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.692647 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.754532 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.821690 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 15:11:29 crc kubenswrapper[4743]: I0310 15:11:29.943512 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.024987 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.108226 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.215003 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.394002 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.432469 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.464310 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.531697 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.616020 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.782855 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.783657 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.789635 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.825550 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.914904 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 15:11:30 crc kubenswrapper[4743]: I0310 15:11:30.914945 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.035669 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.052434 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.106742 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.155226 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.165767 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.173414 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.271124 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.337270 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.367436 4743 patch_prober.go:28] interesting pod/route-controller-manager-5c4b8bd597-5pd2x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.367504 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" podUID="4f10d973-c107-4a86-bd97-dac12b7dd7b0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.367834 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.400170 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.478742 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.490916 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.555923 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.566983 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.630135 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.695459 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 10 15:11:31 crc kubenswrapper[4743]: I0310 15:11:31.880172 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.061034 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.078792 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.196661 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.259230 4743 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.298137 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.367356 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.449963 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.450898 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.468571 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.505761 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.528773 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.548596 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.552408 4743 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.586098 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.624772 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.674658 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.703039 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.742672 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.784605 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 10 15:11:32 crc kubenswrapper[4743]: I0310 15:11:32.784878 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.049235 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.057963 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.070748 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.079906 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.136462 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.170452 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.200844 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.219026 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.248630 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.271106 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.273323 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.335305 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.343415 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.360186 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.423402 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.531349 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.610015 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.626486 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.630663 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.675887 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.713286 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.713304 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.715359 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.747915 4743 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.752372 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66998f7cd-wm472" podStartSLOduration=45.752345467 podStartE2EDuration="45.752345467s" podCreationTimestamp="2026-03-10 15:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:17.936386064 +0000 UTC m=+342.643200812" watchObservedRunningTime="2026-03-10 15:11:33.752345467 +0000 UTC m=+358.459160255"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.753094 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" podStartSLOduration=45.75308521 podStartE2EDuration="45.75308521s" podCreationTimestamp="2026-03-10 15:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:18.938023212 +0000 UTC m=+343.644837960" watchObservedRunningTime="2026-03-10 15:11:33.75308521 +0000 UTC m=+358.459899968"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.755181 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.755242 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.755271 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66998f7cd-wm472","openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x"]
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.760218 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.808619 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.808597837 podStartE2EDuration="20.808597837s" podCreationTimestamp="2026-03-10 15:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:33.782455106 +0000 UTC m=+358.489269844" watchObservedRunningTime="2026-03-10 15:11:33.808597837 +0000 UTC m=+358.515412585"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.810491 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.837617 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 10 15:11:33 crc kubenswrapper[4743]: I0310 15:11:33.933392 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.033678 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.102339 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.350881 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.569249 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.673764 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.687857 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.756456 4743 patch_prober.go:28] interesting pod/route-controller-manager-5c4b8bd597-5pd2x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.757085 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" podUID="4f10d973-c107-4a86-bd97-dac12b7dd7b0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.792133 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.802348 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.803599 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 10 15:11:34 crc kubenswrapper[4743]: I0310 15:11:34.902565 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.000184 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.063973 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.308029 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.340430 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.352863 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.422224 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.422464 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.584645 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.587987 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.674310 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.675388 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.707976 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.814707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.815843 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.967107 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 10 15:11:35 crc kubenswrapper[4743]: I0310 15:11:35.975591 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.011661 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.059043 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.067118 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.086826 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.102997 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.104193 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.114899 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.147474 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.151140 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.170552 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.315572 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.331756 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.343715 4743 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 10 15:11:36 crc kubenswrapper[4743]: 
I0310 15:11:36.344050 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a" gracePeriod=5 Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.397913 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.461525 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.676637 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.701274 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.717702 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.829783 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.929483 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 15:11:36 crc kubenswrapper[4743]: I0310 15:11:36.989046 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.044012 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.113838 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.228030 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.338850 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.658988 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.666267 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.768023 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.779087 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.786032 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.848775 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.854132 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 15:11:37 
crc kubenswrapper[4743]: I0310 15:11:37.878920 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 15:11:37 crc kubenswrapper[4743]: I0310 15:11:37.916991 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.131737 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.240404 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.260450 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.604457 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.632151 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.637640 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.733901 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.754063 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.909876 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 15:11:38 crc kubenswrapper[4743]: I0310 15:11:38.927099 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 15:11:39 crc kubenswrapper[4743]: I0310 15:11:39.165547 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 15:11:39 crc kubenswrapper[4743]: I0310 15:11:39.177129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 15:11:39 crc kubenswrapper[4743]: I0310 15:11:39.195583 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 15:11:39 crc kubenswrapper[4743]: I0310 15:11:39.322283 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 15:11:40 crc kubenswrapper[4743]: I0310 15:11:40.370522 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c4b8bd597-5pd2x" Mar 10 15:11:41 crc kubenswrapper[4743]: I0310 15:11:41.943446 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 15:11:41 crc kubenswrapper[4743]: I0310 15:11:41.943944 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.076399 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.076531 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.076599 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.076729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.076745 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.076801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.076905 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.076965 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.077019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.077687 4743 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.077713 4743 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.077723 4743 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.077733 4743 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.091354 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.123197 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.123272 4743 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a" exitCode=137 Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.123339 4743 scope.go:117] "RemoveContainer" containerID="5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.123494 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.162953 4743 scope.go:117] "RemoveContainer" containerID="5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a" Mar 10 15:11:42 crc kubenswrapper[4743]: E0310 15:11:42.163571 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a\": container with ID starting with 5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a not found: ID does not exist" containerID="5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.163645 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a"} err="failed to get container status \"5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a\": rpc error: code = NotFound desc = could not find container 
\"5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a\": container with ID starting with 5c2050005384533e52b64c5e6cac60bcd06ff2161faec33a67a2b18c71bc046a not found: ID does not exist" Mar 10 15:11:42 crc kubenswrapper[4743]: I0310 15:11:42.178579 4743 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:43 crc kubenswrapper[4743]: I0310 15:11:43.927114 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 15:11:47 crc kubenswrapper[4743]: I0310 15:11:47.943766 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 15:11:52 crc kubenswrapper[4743]: I0310 15:11:52.791356 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 15:11:56 crc kubenswrapper[4743]: I0310 15:11:56.373688 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 15:11:56 crc kubenswrapper[4743]: I0310 15:11:56.720751 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 15:11:57 crc kubenswrapper[4743]: I0310 15:11:57.403690 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 15:11:58 crc kubenswrapper[4743]: I0310 15:11:58.954043 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 15:11:58 crc kubenswrapper[4743]: I0310 15:11:58.995600 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 15:11:59 crc kubenswrapper[4743]: I0310 15:11:59.378710 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 15:11:59 crc kubenswrapper[4743]: I0310 15:11:59.545511 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 15:11:59 crc kubenswrapper[4743]: I0310 15:11:59.701756 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 15:11:59 crc kubenswrapper[4743]: I0310 15:11:59.825752 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.132707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.172061 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552592-mmgm4"] Mar 10 15:12:00 crc kubenswrapper[4743]: E0310 15:12:00.172352 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" containerName="installer" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.172370 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" containerName="installer" Mar 10 15:12:00 crc kubenswrapper[4743]: E0310 15:12:00.172384 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.172391 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.172499 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbaf72e5-0721-41a6-957c-28ce7dbb7ff1" containerName="installer" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.172525 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.173001 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-mmgm4" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.178179 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.178210 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.178423 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.188037 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-mmgm4"] Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.196570 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mx5k\" (UniqueName: \"kubernetes.io/projected/9c7d8aeb-2177-44ee-b7e0-d745883183af-kube-api-access-5mx5k\") pod \"auto-csr-approver-29552592-mmgm4\" (UID: \"9c7d8aeb-2177-44ee-b7e0-d745883183af\") " pod="openshift-infra/auto-csr-approver-29552592-mmgm4" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.293142 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 15:12:00 crc kubenswrapper[4743]: 
I0310 15:12:00.298017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mx5k\" (UniqueName: \"kubernetes.io/projected/9c7d8aeb-2177-44ee-b7e0-d745883183af-kube-api-access-5mx5k\") pod \"auto-csr-approver-29552592-mmgm4\" (UID: \"9c7d8aeb-2177-44ee-b7e0-d745883183af\") " pod="openshift-infra/auto-csr-approver-29552592-mmgm4" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.320452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mx5k\" (UniqueName: \"kubernetes.io/projected/9c7d8aeb-2177-44ee-b7e0-d745883183af-kube-api-access-5mx5k\") pod \"auto-csr-approver-29552592-mmgm4\" (UID: \"9c7d8aeb-2177-44ee-b7e0-d745883183af\") " pod="openshift-infra/auto-csr-approver-29552592-mmgm4" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.491151 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-mmgm4" Mar 10 15:12:00 crc kubenswrapper[4743]: I0310 15:12:00.929997 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-mmgm4"] Mar 10 15:12:01 crc kubenswrapper[4743]: I0310 15:12:01.240092 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerID="b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510" exitCode=0 Mar 10 15:12:01 crc kubenswrapper[4743]: I0310 15:12:01.240221 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" event={"ID":"ba41eb29-8687-44ad-8001-642d0ff1fd7f","Type":"ContainerDied","Data":"b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510"} Mar 10 15:12:01 crc kubenswrapper[4743]: I0310 15:12:01.241059 4743 scope.go:117] "RemoveContainer" containerID="b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510" Mar 10 15:12:01 crc kubenswrapper[4743]: I0310 15:12:01.243088 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-mmgm4" event={"ID":"9c7d8aeb-2177-44ee-b7e0-d745883183af","Type":"ContainerStarted","Data":"3cc99977fdcc65730f014cb6b0a9592193bae2ed454bacae107ce7036bcb2162"} Mar 10 15:12:02 crc kubenswrapper[4743]: I0310 15:12:02.255354 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" event={"ID":"ba41eb29-8687-44ad-8001-642d0ff1fd7f","Type":"ContainerStarted","Data":"50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4"} Mar 10 15:12:02 crc kubenswrapper[4743]: I0310 15:12:02.257518 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:12:02 crc kubenswrapper[4743]: I0310 15:12:02.259854 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:12:02 crc kubenswrapper[4743]: I0310 15:12:02.941778 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 15:12:03 crc kubenswrapper[4743]: I0310 15:12:03.023190 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 15:12:03 crc kubenswrapper[4743]: I0310 15:12:03.265086 4743 generic.go:334] "Generic (PLEG): container finished" podID="9c7d8aeb-2177-44ee-b7e0-d745883183af" containerID="664574cde011ed7c51cca1b801dd20432041002097974fd6bc518261037628c6" exitCode=0 Mar 10 15:12:03 crc kubenswrapper[4743]: I0310 15:12:03.265179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-mmgm4" event={"ID":"9c7d8aeb-2177-44ee-b7e0-d745883183af","Type":"ContainerDied","Data":"664574cde011ed7c51cca1b801dd20432041002097974fd6bc518261037628c6"} Mar 10 15:12:04 crc kubenswrapper[4743]: I0310 
15:12:04.445196 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 15:12:04 crc kubenswrapper[4743]: I0310 15:12:04.629369 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-mmgm4" Mar 10 15:12:04 crc kubenswrapper[4743]: I0310 15:12:04.664052 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mx5k\" (UniqueName: \"kubernetes.io/projected/9c7d8aeb-2177-44ee-b7e0-d745883183af-kube-api-access-5mx5k\") pod \"9c7d8aeb-2177-44ee-b7e0-d745883183af\" (UID: \"9c7d8aeb-2177-44ee-b7e0-d745883183af\") " Mar 10 15:12:04 crc kubenswrapper[4743]: I0310 15:12:04.687578 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7d8aeb-2177-44ee-b7e0-d745883183af-kube-api-access-5mx5k" (OuterVolumeSpecName: "kube-api-access-5mx5k") pod "9c7d8aeb-2177-44ee-b7e0-d745883183af" (UID: "9c7d8aeb-2177-44ee-b7e0-d745883183af"). InnerVolumeSpecName "kube-api-access-5mx5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:12:04 crc kubenswrapper[4743]: I0310 15:12:04.765243 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mx5k\" (UniqueName: \"kubernetes.io/projected/9c7d8aeb-2177-44ee-b7e0-d745883183af-kube-api-access-5mx5k\") on node \"crc\" DevicePath \"\"" Mar 10 15:12:05 crc kubenswrapper[4743]: I0310 15:12:05.281229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-mmgm4" event={"ID":"9c7d8aeb-2177-44ee-b7e0-d745883183af","Type":"ContainerDied","Data":"3cc99977fdcc65730f014cb6b0a9592193bae2ed454bacae107ce7036bcb2162"} Mar 10 15:12:05 crc kubenswrapper[4743]: I0310 15:12:05.281282 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc99977fdcc65730f014cb6b0a9592193bae2ed454bacae107ce7036bcb2162" Mar 10 15:12:05 crc kubenswrapper[4743]: I0310 15:12:05.281390 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-mmgm4" Mar 10 15:12:05 crc kubenswrapper[4743]: I0310 15:12:05.345547 4743 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 15:12:09 crc kubenswrapper[4743]: I0310 15:12:09.761656 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 15:12:10 crc kubenswrapper[4743]: I0310 15:12:10.658312 4743 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 15:12:16 crc kubenswrapper[4743]: I0310 15:12:16.107873 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.341472 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-98tt6"] Mar 10 15:12:50 crc kubenswrapper[4743]: E0310 
15:12:50.342338 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7d8aeb-2177-44ee-b7e0-d745883183af" containerName="oc" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.342354 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7d8aeb-2177-44ee-b7e0-d745883183af" containerName="oc" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.342475 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7d8aeb-2177-44ee-b7e0-d745883183af" containerName="oc" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.342928 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.377865 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-98tt6"] Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.468404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-trusted-ca\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.468472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.468565 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.468592 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-registry-tls\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.468621 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-bound-sa-token\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.468642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.468807 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-registry-certificates\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 
10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.468959 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8jz4\" (UniqueName: \"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-kube-api-access-n8jz4\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.495717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.570328 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.571204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-registry-certificates\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.571320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8jz4\" (UniqueName: 
\"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-kube-api-access-n8jz4\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.571437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-trusted-ca\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.571695 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.571875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-registry-tls\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.572084 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-bound-sa-token\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.572500 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.574265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-registry-certificates\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.575621 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-trusted-ca\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.581616 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-registry-tls\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.583674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.600300 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jz4\" (UniqueName: \"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-kube-api-access-n8jz4\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.602562 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d2bfbed-4ff4-4817-8ece-d4e5f12123cd-bound-sa-token\") pod \"image-registry-66df7c8f76-98tt6\" (UID: \"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:50 crc kubenswrapper[4743]: I0310 15:12:50.666547 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:12:51 crc kubenswrapper[4743]: I0310 15:12:51.087769 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-98tt6"] Mar 10 15:12:51 crc kubenswrapper[4743]: I0310 15:12:51.548896 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" event={"ID":"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd","Type":"ContainerStarted","Data":"267aa1cc3f9cde45986fe2be1608c2d63616ede44bf3bf352cc66d321fb8f57b"} Mar 10 15:12:51 crc kubenswrapper[4743]: I0310 15:12:51.548951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" event={"ID":"2d2bfbed-4ff4-4817-8ece-d4e5f12123cd","Type":"ContainerStarted","Data":"7a2be7225db0d9ee920090f2a04d76c0bc78fa97ae077a44353ce5bef7b6527d"} Mar 10 15:12:51 crc kubenswrapper[4743]: I0310 15:12:51.549047 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 
10 15:12:51 crc kubenswrapper[4743]: I0310 15:12:51.571008 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" podStartSLOduration=1.570980902 podStartE2EDuration="1.570980902s" podCreationTimestamp="2026-03-10 15:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:12:51.567557238 +0000 UTC m=+436.274371996" watchObservedRunningTime="2026-03-10 15:12:51.570980902 +0000 UTC m=+436.277795660" Mar 10 15:13:02 crc kubenswrapper[4743]: I0310 15:13:02.259615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:13:02 crc kubenswrapper[4743]: I0310 15:13:02.260569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:13:02 crc kubenswrapper[4743]: I0310 15:13:02.261689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:13:02 crc kubenswrapper[4743]: I0310 15:13:02.267943 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:13:02 crc kubenswrapper[4743]: I0310 15:13:02.415435 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:13:03 crc kubenswrapper[4743]: I0310 15:13:03.388372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:13:03 crc kubenswrapper[4743]: I0310 15:13:03.388943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:13:03 crc kubenswrapper[4743]: I0310 15:13:03.396165 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:13:03 crc kubenswrapper[4743]: I0310 15:13:03.397153 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:13:03 crc kubenswrapper[4743]: I0310 15:13:03.616027 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:13:03 crc kubenswrapper[4743]: I0310 15:13:03.616094 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:13:03 crc kubenswrapper[4743]: I0310 15:13:03.655957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"237d6a344833c3d7b985872061836f8019a3b7c98361f2f4a423392f6358d64f"} Mar 10 15:13:03 crc kubenswrapper[4743]: I0310 15:13:03.656047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bfee04fe00df71aff6a120a12426f7f3813013be1a9f822c2648970c3f4941b5"} Mar 10 15:13:03 crc kubenswrapper[4743]: W0310 15:13:03.989287 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e9a185684836b6a41b5542b1d33849d30a706526527244504626cb41c3ca756f WatchSource:0}: Error finding container e9a185684836b6a41b5542b1d33849d30a706526527244504626cb41c3ca756f: Status 404 returned error can't find the container with id e9a185684836b6a41b5542b1d33849d30a706526527244504626cb41c3ca756f Mar 10 15:13:04 crc kubenswrapper[4743]: W0310 
15:13:04.118181 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ab939f17a848da7855b5915d7c29af0e3e21c0a9960863019605f50097a5c01d WatchSource:0}: Error finding container ab939f17a848da7855b5915d7c29af0e3e21c0a9960863019605f50097a5c01d: Status 404 returned error can't find the container with id ab939f17a848da7855b5915d7c29af0e3e21c0a9960863019605f50097a5c01d Mar 10 15:13:04 crc kubenswrapper[4743]: I0310 15:13:04.662781 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1e2944638b173655109a8bbc15b752db2366365c4e6681c70d5d3a1429c89a81"} Mar 10 15:13:04 crc kubenswrapper[4743]: I0310 15:13:04.663256 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e9a185684836b6a41b5542b1d33849d30a706526527244504626cb41c3ca756f"} Mar 10 15:13:04 crc kubenswrapper[4743]: I0310 15:13:04.663504 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:13:04 crc kubenswrapper[4743]: I0310 15:13:04.664116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2722edfa8adad4eea547ecaadc2ca16dd4c405157213ddc0b58834b88df91a86"} Mar 10 15:13:04 crc kubenswrapper[4743]: I0310 15:13:04.664143 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ab939f17a848da7855b5915d7c29af0e3e21c0a9960863019605f50097a5c01d"} Mar 10 15:13:10 crc kubenswrapper[4743]: I0310 15:13:10.676271 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-98tt6" Mar 10 15:13:10 crc kubenswrapper[4743]: I0310 15:13:10.794344 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mxlth"] Mar 10 15:13:11 crc kubenswrapper[4743]: I0310 15:13:11.253429 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:13:11 crc kubenswrapper[4743]: I0310 15:13:11.253911 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.860636 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f6tgf"] Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.861998 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f6tgf" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" containerName="registry-server" containerID="cri-o://852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d" gracePeriod=30 Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.867133 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qbmf"] 
Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.867406 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8qbmf" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerName="registry-server" containerID="cri-o://1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8" gracePeriod=30 Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.886231 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l66d6"] Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.886521 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" containerID="cri-o://50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4" gracePeriod=30 Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.893575 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxzjq"] Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.893952 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rxzjq" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerName="registry-server" containerID="cri-o://7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d" gracePeriod=30 Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.907362 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hq6n"] Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.907763 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2hq6n" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" containerName="registry-server" 
containerID="cri-o://859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd" gracePeriod=30 Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.915070 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2tpls"] Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.916925 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:18 crc kubenswrapper[4743]: I0310 15:13:18.975847 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2tpls"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.064431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f6c8824b-120a-4480-bdad-a18584d52bad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.064900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh47d\" (UniqueName: \"kubernetes.io/projected/f6c8824b-120a-4480-bdad-a18584d52bad-kube-api-access-sh47d\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.066130 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6c8824b-120a-4480-bdad-a18584d52bad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.167565 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh47d\" (UniqueName: \"kubernetes.io/projected/f6c8824b-120a-4480-bdad-a18584d52bad-kube-api-access-sh47d\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.167629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6c8824b-120a-4480-bdad-a18584d52bad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.167694 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f6c8824b-120a-4480-bdad-a18584d52bad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.177018 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f6c8824b-120a-4480-bdad-a18584d52bad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.184757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f6c8824b-120a-4480-bdad-a18584d52bad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.198099 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh47d\" (UniqueName: \"kubernetes.io/projected/f6c8824b-120a-4480-bdad-a18584d52bad-kube-api-access-sh47d\") pod \"marketplace-operator-79b997595-2tpls\" (UID: \"f6c8824b-120a-4480-bdad-a18584d52bad\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.240004 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.311655 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.370722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzd7j\" (UniqueName: \"kubernetes.io/projected/5fb16ec3-618c-4095-a3e7-3f59920d921b-kube-api-access-gzd7j\") pod \"5fb16ec3-618c-4095-a3e7-3f59920d921b\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.370795 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-utilities\") pod \"5fb16ec3-618c-4095-a3e7-3f59920d921b\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.370852 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-catalog-content\") pod \"5fb16ec3-618c-4095-a3e7-3f59920d921b\" (UID: \"5fb16ec3-618c-4095-a3e7-3f59920d921b\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.373375 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-utilities" (OuterVolumeSpecName: "utilities") pod "5fb16ec3-618c-4095-a3e7-3f59920d921b" (UID: "5fb16ec3-618c-4095-a3e7-3f59920d921b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.374331 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.379600 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fb16ec3-618c-4095-a3e7-3f59920d921b-kube-api-access-gzd7j" (OuterVolumeSpecName: "kube-api-access-gzd7j") pod "5fb16ec3-618c-4095-a3e7-3f59920d921b" (UID: "5fb16ec3-618c-4095-a3e7-3f59920d921b"). InnerVolumeSpecName "kube-api-access-gzd7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.418903 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.433566 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.458246 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.474524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m29gj\" (UniqueName: \"kubernetes.io/projected/d815b880-6675-42e2-8380-3e1aaae065a7-kube-api-access-m29gj\") pod \"d815b880-6675-42e2-8380-3e1aaae065a7\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.474627 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-catalog-content\") pod \"d815b880-6675-42e2-8380-3e1aaae065a7\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.474728 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-utilities\") pod \"d815b880-6675-42e2-8380-3e1aaae065a7\" (UID: \"d815b880-6675-42e2-8380-3e1aaae065a7\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.474991 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzd7j\" (UniqueName: \"kubernetes.io/projected/5fb16ec3-618c-4095-a3e7-3f59920d921b-kube-api-access-gzd7j\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.475045 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.475799 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-utilities" (OuterVolumeSpecName: "utilities") pod "d815b880-6675-42e2-8380-3e1aaae065a7" (UID: 
"d815b880-6675-42e2-8380-3e1aaae065a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.481770 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d815b880-6675-42e2-8380-3e1aaae065a7-kube-api-access-m29gj" (OuterVolumeSpecName: "kube-api-access-m29gj") pod "d815b880-6675-42e2-8380-3e1aaae065a7" (UID: "d815b880-6675-42e2-8380-3e1aaae065a7"). InnerVolumeSpecName "kube-api-access-m29gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.490237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fb16ec3-618c-4095-a3e7-3f59920d921b" (UID: "5fb16ec3-618c-4095-a3e7-3f59920d921b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.541193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d815b880-6675-42e2-8380-3e1aaae065a7" (UID: "d815b880-6675-42e2-8380-3e1aaae065a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-catalog-content\") pod \"0eb81c77-0afe-417e-a904-90a76e45f309\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576403 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-utilities\") pod \"0eb81c77-0afe-417e-a904-90a76e45f309\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-operator-metrics\") pod \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576491 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-catalog-content\") pod \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d58x2\" (UniqueName: \"kubernetes.io/projected/0eb81c77-0afe-417e-a904-90a76e45f309-kube-api-access-d58x2\") pod \"0eb81c77-0afe-417e-a904-90a76e45f309\" (UID: \"0eb81c77-0afe-417e-a904-90a76e45f309\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-utilities\") pod \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576576 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-trusted-ca\") pod \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576662 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hnpv\" (UniqueName: \"kubernetes.io/projected/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-kube-api-access-8hnpv\") pod \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\" (UID: \"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.576690 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pvkf\" (UniqueName: \"kubernetes.io/projected/ba41eb29-8687-44ad-8001-642d0ff1fd7f-kube-api-access-2pvkf\") pod \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\" (UID: \"ba41eb29-8687-44ad-8001-642d0ff1fd7f\") " Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.577020 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fb16ec3-618c-4095-a3e7-3f59920d921b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.577043 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.577058 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m29gj\" (UniqueName: 
\"kubernetes.io/projected/d815b880-6675-42e2-8380-3e1aaae065a7-kube-api-access-m29gj\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.577073 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d815b880-6675-42e2-8380-3e1aaae065a7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.579166 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-utilities" (OuterVolumeSpecName: "utilities") pod "5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" (UID: "5aa7f8aa-e0be-4d07-aecd-ccb769d0713c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.579197 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ba41eb29-8687-44ad-8001-642d0ff1fd7f" (UID: "ba41eb29-8687-44ad-8001-642d0ff1fd7f"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.579709 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-utilities" (OuterVolumeSpecName: "utilities") pod "0eb81c77-0afe-417e-a904-90a76e45f309" (UID: "0eb81c77-0afe-417e-a904-90a76e45f309"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.580778 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-kube-api-access-8hnpv" (OuterVolumeSpecName: "kube-api-access-8hnpv") pod "5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" (UID: "5aa7f8aa-e0be-4d07-aecd-ccb769d0713c"). InnerVolumeSpecName "kube-api-access-8hnpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.581085 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb81c77-0afe-417e-a904-90a76e45f309-kube-api-access-d58x2" (OuterVolumeSpecName: "kube-api-access-d58x2") pod "0eb81c77-0afe-417e-a904-90a76e45f309" (UID: "0eb81c77-0afe-417e-a904-90a76e45f309"). InnerVolumeSpecName "kube-api-access-d58x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.582218 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba41eb29-8687-44ad-8001-642d0ff1fd7f-kube-api-access-2pvkf" (OuterVolumeSpecName: "kube-api-access-2pvkf") pod "ba41eb29-8687-44ad-8001-642d0ff1fd7f" (UID: "ba41eb29-8687-44ad-8001-642d0ff1fd7f"). InnerVolumeSpecName "kube-api-access-2pvkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.583249 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ba41eb29-8687-44ad-8001-642d0ff1fd7f" (UID: "ba41eb29-8687-44ad-8001-642d0ff1fd7f"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.610605 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" (UID: "5aa7f8aa-e0be-4d07-aecd-ccb769d0713c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.678916 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.678957 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.678969 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.678978 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d58x2\" (UniqueName: \"kubernetes.io/projected/0eb81c77-0afe-417e-a904-90a76e45f309-kube-api-access-d58x2\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.678988 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.678998 4743 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba41eb29-8687-44ad-8001-642d0ff1fd7f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.679005 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hnpv\" (UniqueName: \"kubernetes.io/projected/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c-kube-api-access-8hnpv\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.679014 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pvkf\" (UniqueName: \"kubernetes.io/projected/ba41eb29-8687-44ad-8001-642d0ff1fd7f-kube-api-access-2pvkf\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.712098 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0eb81c77-0afe-417e-a904-90a76e45f309" (UID: "0eb81c77-0afe-417e-a904-90a76e45f309"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.774260 4743 generic.go:334] "Generic (PLEG): container finished" podID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerID="7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d" exitCode=0 Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.774327 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rxzjq" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.774324 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxzjq" event={"ID":"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c","Type":"ContainerDied","Data":"7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.775109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rxzjq" event={"ID":"5aa7f8aa-e0be-4d07-aecd-ccb769d0713c","Type":"ContainerDied","Data":"4d83bcaa8e40a02245c4ee0eb2fad34f8c74f1805788c3a5b470868a44c1a72d"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.775201 4743 scope.go:117] "RemoveContainer" containerID="7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.779234 4743 generic.go:334] "Generic (PLEG): container finished" podID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerID="1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8" exitCode=0 Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.779361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qbmf" event={"ID":"5fb16ec3-618c-4095-a3e7-3f59920d921b","Type":"ContainerDied","Data":"1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.779463 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qbmf" event={"ID":"5fb16ec3-618c-4095-a3e7-3f59920d921b","Type":"ContainerDied","Data":"e1effe78e4632c24d58fe7ba06196e0b73c6b659607c92b8058ec8f70763f4d4"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.779619 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qbmf" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.779983 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb81c77-0afe-417e-a904-90a76e45f309-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.782694 4743 generic.go:334] "Generic (PLEG): container finished" podID="0eb81c77-0afe-417e-a904-90a76e45f309" containerID="859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd" exitCode=0 Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.782871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hq6n" event={"ID":"0eb81c77-0afe-417e-a904-90a76e45f309","Type":"ContainerDied","Data":"859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.782946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hq6n" event={"ID":"0eb81c77-0afe-417e-a904-90a76e45f309","Type":"ContainerDied","Data":"48fc5507d84a7ed5f994d2c8cc0aedd3133c0c9d9a12213984187744328afb84"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.783077 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hq6n" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.786270 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerID="50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4" exitCode=0 Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.786368 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.786382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" event={"ID":"ba41eb29-8687-44ad-8001-642d0ff1fd7f","Type":"ContainerDied","Data":"50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.786988 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l66d6" event={"ID":"ba41eb29-8687-44ad-8001-642d0ff1fd7f","Type":"ContainerDied","Data":"8e0f192db5deba03e306442575a0b6730c9271dc4a719a595fe0fcb030396891"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.790955 4743 generic.go:334] "Generic (PLEG): container finished" podID="d815b880-6675-42e2-8380-3e1aaae065a7" containerID="852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d" exitCode=0 Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.790991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6tgf" event={"ID":"d815b880-6675-42e2-8380-3e1aaae065a7","Type":"ContainerDied","Data":"852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.791017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6tgf" event={"ID":"d815b880-6675-42e2-8380-3e1aaae065a7","Type":"ContainerDied","Data":"da0735b1e6092952ace9c7df1ba541c9a21b7b47de758285b53ff418e221ccb2"} Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.791015 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f6tgf" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.800680 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2tpls"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.804729 4743 scope.go:117] "RemoveContainer" containerID="11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.811044 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxzjq"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.817507 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rxzjq"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.836253 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qbmf"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.837735 4743 scope.go:117] "RemoveContainer" containerID="2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.840365 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8qbmf"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.861027 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f6tgf"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.867890 4743 scope.go:117] "RemoveContainer" containerID="7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.869333 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f6tgf"] Mar 10 15:13:19 crc kubenswrapper[4743]: E0310 15:13:19.869951 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d\": container with ID starting with 7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d not found: ID does not exist" containerID="7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.870018 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d"} err="failed to get container status \"7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d\": rpc error: code = NotFound desc = could not find container \"7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d\": container with ID starting with 7ff4e776b6f39d5c5a685060f276c21c4f9ba04c095685b875c05e71b760b96d not found: ID does not exist" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.870094 4743 scope.go:117] "RemoveContainer" containerID="11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80" Mar 10 15:13:19 crc kubenswrapper[4743]: E0310 15:13:19.870844 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80\": container with ID starting with 11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80 not found: ID does not exist" containerID="11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.870896 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80"} err="failed to get container status \"11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80\": rpc error: code = NotFound desc = could not find container 
\"11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80\": container with ID starting with 11cdd2daf8b005d8760747f4bb0551f4333944d29b3d05f66f6e8222a513da80 not found: ID does not exist" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.870935 4743 scope.go:117] "RemoveContainer" containerID="2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad" Mar 10 15:13:19 crc kubenswrapper[4743]: E0310 15:13:19.871357 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad\": container with ID starting with 2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad not found: ID does not exist" containerID="2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.871387 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad"} err="failed to get container status \"2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad\": rpc error: code = NotFound desc = could not find container \"2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad\": container with ID starting with 2b0b99538c87a3565bee477a33ea33e59149d909a8debd12b0292152571a45ad not found: ID does not exist" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.871404 4743 scope.go:117] "RemoveContainer" containerID="1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.882007 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l66d6"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.889722 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l66d6"] Mar 10 15:13:19 
crc kubenswrapper[4743]: I0310 15:13:19.898781 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hq6n"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.904301 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2hq6n"] Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.917887 4743 scope.go:117] "RemoveContainer" containerID="5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.930486 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" path="/var/lib/kubelet/pods/0eb81c77-0afe-417e-a904-90a76e45f309/volumes" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.931210 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" path="/var/lib/kubelet/pods/5aa7f8aa-e0be-4d07-aecd-ccb769d0713c/volumes" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.934948 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" path="/var/lib/kubelet/pods/5fb16ec3-618c-4095-a3e7-3f59920d921b/volumes" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.939036 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" path="/var/lib/kubelet/pods/ba41eb29-8687-44ad-8001-642d0ff1fd7f/volumes" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.939691 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" path="/var/lib/kubelet/pods/d815b880-6675-42e2-8380-3e1aaae065a7/volumes" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.944379 4743 scope.go:117] "RemoveContainer" containerID="7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.967974 4743 
scope.go:117] "RemoveContainer" containerID="1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8" Mar 10 15:13:19 crc kubenswrapper[4743]: E0310 15:13:19.968944 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8\": container with ID starting with 1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8 not found: ID does not exist" containerID="1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.969038 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8"} err="failed to get container status \"1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8\": rpc error: code = NotFound desc = could not find container \"1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8\": container with ID starting with 1a0d2ed0bbc1a4655955fe752389da147807175ee376997f38ef83a75b6d6ca8 not found: ID does not exist" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.969107 4743 scope.go:117] "RemoveContainer" containerID="5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe" Mar 10 15:13:19 crc kubenswrapper[4743]: E0310 15:13:19.969774 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe\": container with ID starting with 5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe not found: ID does not exist" containerID="5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.969902 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe"} err="failed to get container status \"5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe\": rpc error: code = NotFound desc = could not find container \"5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe\": container with ID starting with 5b72a783189a4ffcaf13cb154c4716e75564e5172480381710a4742de4781bbe not found: ID does not exist" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.969944 4743 scope.go:117] "RemoveContainer" containerID="7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1" Mar 10 15:13:19 crc kubenswrapper[4743]: E0310 15:13:19.970534 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1\": container with ID starting with 7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1 not found: ID does not exist" containerID="7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.970599 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1"} err="failed to get container status \"7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1\": rpc error: code = NotFound desc = could not find container \"7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1\": container with ID starting with 7530c91f623b33c052d852bca1b73c4a80befaf7b6ab38f1c9dbbfc74bdd07a1 not found: ID does not exist" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.970639 4743 scope.go:117] "RemoveContainer" containerID="859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd" Mar 10 15:13:19 crc kubenswrapper[4743]: I0310 15:13:19.986762 4743 scope.go:117] "RemoveContainer" 
containerID="45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.049797 4743 scope.go:117] "RemoveContainer" containerID="39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.067499 4743 scope.go:117] "RemoveContainer" containerID="859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd" Mar 10 15:13:20 crc kubenswrapper[4743]: E0310 15:13:20.068240 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd\": container with ID starting with 859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd not found: ID does not exist" containerID="859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.068305 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd"} err="failed to get container status \"859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd\": rpc error: code = NotFound desc = could not find container \"859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd\": container with ID starting with 859080fa24e41a8a71815c890a48470dc89ee44e0f7ba8e747a23402cdb494dd not found: ID does not exist" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.068348 4743 scope.go:117] "RemoveContainer" containerID="45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a" Mar 10 15:13:20 crc kubenswrapper[4743]: E0310 15:13:20.068872 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a\": container with ID starting with 
45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a not found: ID does not exist" containerID="45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.068931 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a"} err="failed to get container status \"45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a\": rpc error: code = NotFound desc = could not find container \"45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a\": container with ID starting with 45f2437260223b2f126a4de5c95edc5b26659c9efecc90bd06aa4f6841a9ff3a not found: ID does not exist" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.068969 4743 scope.go:117] "RemoveContainer" containerID="39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9" Mar 10 15:13:20 crc kubenswrapper[4743]: E0310 15:13:20.069295 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9\": container with ID starting with 39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9 not found: ID does not exist" containerID="39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.069333 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9"} err="failed to get container status \"39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9\": rpc error: code = NotFound desc = could not find container \"39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9\": container with ID starting with 39f7927c0ff40bc08778f624078b95445acd7cee98a830918f8899cd9bd0eab9 not found: ID does not 
exist" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.069357 4743 scope.go:117] "RemoveContainer" containerID="50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.085513 4743 scope.go:117] "RemoveContainer" containerID="b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.102939 4743 scope.go:117] "RemoveContainer" containerID="50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4" Mar 10 15:13:20 crc kubenswrapper[4743]: E0310 15:13:20.103383 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4\": container with ID starting with 50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4 not found: ID does not exist" containerID="50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.103416 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4"} err="failed to get container status \"50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4\": rpc error: code = NotFound desc = could not find container \"50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4\": container with ID starting with 50527eefa37a4882ea5e7dfa6e0e40d70a983a56cf50cc8cf4fbf379a9d969d4 not found: ID does not exist" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.103443 4743 scope.go:117] "RemoveContainer" containerID="b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510" Mar 10 15:13:20 crc kubenswrapper[4743]: E0310 15:13:20.103780 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510\": container with ID starting with b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510 not found: ID does not exist" containerID="b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.103801 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510"} err="failed to get container status \"b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510\": rpc error: code = NotFound desc = could not find container \"b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510\": container with ID starting with b7f4113b98cf138b43128d93a8fb760fe84f6f873029cc9c6160627c3508f510 not found: ID does not exist" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.103901 4743 scope.go:117] "RemoveContainer" containerID="852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.121036 4743 scope.go:117] "RemoveContainer" containerID="865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.140851 4743 scope.go:117] "RemoveContainer" containerID="56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.163642 4743 scope.go:117] "RemoveContainer" containerID="852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d" Mar 10 15:13:20 crc kubenswrapper[4743]: E0310 15:13:20.164544 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d\": container with ID starting with 852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d not found: ID does not exist" 
containerID="852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.164850 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d"} err="failed to get container status \"852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d\": rpc error: code = NotFound desc = could not find container \"852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d\": container with ID starting with 852f4ae3e82aaac8673dc6377386e1c1800dec67f0a2c82dcf1a753b2b5cb83d not found: ID does not exist" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.164901 4743 scope.go:117] "RemoveContainer" containerID="865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8" Mar 10 15:13:20 crc kubenswrapper[4743]: E0310 15:13:20.165514 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8\": container with ID starting with 865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8 not found: ID does not exist" containerID="865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.165543 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8"} err="failed to get container status \"865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8\": rpc error: code = NotFound desc = could not find container \"865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8\": container with ID starting with 865ce33157e10b0c96ecb2b00bff0593ce4f5ced465b22789eb565356b7523c8 not found: ID does not exist" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.165561 4743 scope.go:117] 
"RemoveContainer" containerID="56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec" Mar 10 15:13:20 crc kubenswrapper[4743]: E0310 15:13:20.165854 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec\": container with ID starting with 56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec not found: ID does not exist" containerID="56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.165877 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec"} err="failed to get container status \"56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec\": rpc error: code = NotFound desc = could not find container \"56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec\": container with ID starting with 56846df52f7073e77c2b36c6d3a43e667255fcc80682a0a0680cccb633fc4fec not found: ID does not exist" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.806636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" event={"ID":"f6c8824b-120a-4480-bdad-a18584d52bad","Type":"ContainerStarted","Data":"2bab0f481bdfcf7930a7c1844f095d0b8da497ecd118778473d91784d56799bb"} Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.806707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" event={"ID":"f6c8824b-120a-4480-bdad-a18584d52bad","Type":"ContainerStarted","Data":"1c5de1e8e6ef61f5d3782b1c46206c81983dfb1f475fc639d01f3ed4db5c0abe"} Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.807119 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.811986 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" Mar 10 15:13:20 crc kubenswrapper[4743]: I0310 15:13:20.825665 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2tpls" podStartSLOduration=2.8256416 podStartE2EDuration="2.8256416s" podCreationTimestamp="2026-03-10 15:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:13:20.82228418 +0000 UTC m=+465.529098928" watchObservedRunningTime="2026-03-10 15:13:20.8256416 +0000 UTC m=+465.532456348" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.082748 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9wrsh"] Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083435 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" containerName="extract-utilities" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083454 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" containerName="extract-utilities" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083469 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" containerName="extract-content" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083478 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" containerName="extract-content" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083498 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" 
containerName="extract-content" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083506 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" containerName="extract-content" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083516 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083522 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083534 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerName="extract-utilities" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083542 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerName="extract-utilities" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083553 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083562 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083571 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerName="extract-content" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083579 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerName="extract-content" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083592 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" 
containerName="extract-utilities" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083599 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" containerName="extract-utilities" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083610 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083617 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083625 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083632 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083642 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerName="extract-content" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083650 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerName="extract-content" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083660 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083667 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083680 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" 
containerName="extract-utilities" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083689 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerName="extract-utilities" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083846 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aa7f8aa-e0be-4d07-aecd-ccb769d0713c" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083861 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb16ec3-618c-4095-a3e7-3f59920d921b" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083874 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb81c77-0afe-417e-a904-90a76e45f309" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083884 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.083891 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d815b880-6675-42e2-8380-3e1aaae065a7" containerName="registry-server" Mar 10 15:13:21 crc kubenswrapper[4743]: E0310 15:13:21.083998 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.084006 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.084095 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba41eb29-8687-44ad-8001-642d0ff1fd7f" containerName="marketplace-operator" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.084720 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.087630 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.095776 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wrsh"] Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.200937 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d17452-34b9-4c1e-a95c-3840493fe263-catalog-content\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.201090 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv5wp\" (UniqueName: \"kubernetes.io/projected/26d17452-34b9-4c1e-a95c-3840493fe263-kube-api-access-xv5wp\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.201134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d17452-34b9-4c1e-a95c-3840493fe263-utilities\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.276272 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cvw9l"] Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.278641 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.281175 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.287559 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cvw9l"] Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.305751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d17452-34b9-4c1e-a95c-3840493fe263-utilities\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.305849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d17452-34b9-4c1e-a95c-3840493fe263-catalog-content\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.305900 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5wp\" (UniqueName: \"kubernetes.io/projected/26d17452-34b9-4c1e-a95c-3840493fe263-kube-api-access-xv5wp\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.306616 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d17452-34b9-4c1e-a95c-3840493fe263-utilities\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 
10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.306902 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d17452-34b9-4c1e-a95c-3840493fe263-catalog-content\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.326010 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv5wp\" (UniqueName: \"kubernetes.io/projected/26d17452-34b9-4c1e-a95c-3840493fe263-kube-api-access-xv5wp\") pod \"redhat-marketplace-9wrsh\" (UID: \"26d17452-34b9-4c1e-a95c-3840493fe263\") " pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.407372 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp25n\" (UniqueName: \"kubernetes.io/projected/190a3104-25c3-4135-bd63-b9e56380c9b9-kube-api-access-fp25n\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.407521 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-catalog-content\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.407571 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-utilities\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " 
pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.417315 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.508584 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-catalog-content\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.508954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-utilities\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.509179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-catalog-content\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.509223 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp25n\" (UniqueName: \"kubernetes.io/projected/190a3104-25c3-4135-bd63-b9e56380c9b9-kube-api-access-fp25n\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.509376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-utilities\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.539212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp25n\" (UniqueName: \"kubernetes.io/projected/190a3104-25c3-4135-bd63-b9e56380c9b9-kube-api-access-fp25n\") pod \"redhat-operators-cvw9l\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.616279 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.638842 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wrsh"] Mar 10 15:13:21 crc kubenswrapper[4743]: W0310 15:13:21.652281 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d17452_34b9_4c1e_a95c_3840493fe263.slice/crio-1d8f4366de9a25aa51be04a30157e176ed9a4d1fd0d435579fc46bddaba611e2 WatchSource:0}: Error finding container 1d8f4366de9a25aa51be04a30157e176ed9a4d1fd0d435579fc46bddaba611e2: Status 404 returned error can't find the container with id 1d8f4366de9a25aa51be04a30157e176ed9a4d1fd0d435579fc46bddaba611e2 Mar 10 15:13:21 crc kubenswrapper[4743]: I0310 15:13:21.827770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wrsh" event={"ID":"26d17452-34b9-4c1e-a95c-3840493fe263","Type":"ContainerStarted","Data":"1d8f4366de9a25aa51be04a30157e176ed9a4d1fd0d435579fc46bddaba611e2"} Mar 10 15:13:22 crc kubenswrapper[4743]: I0310 15:13:22.048588 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cvw9l"] 
Mar 10 15:13:22 crc kubenswrapper[4743]: I0310 15:13:22.837664 4743 generic.go:334] "Generic (PLEG): container finished" podID="26d17452-34b9-4c1e-a95c-3840493fe263" containerID="c2de93cbabfbe9696bff877b968b7a5443cc189b57304cedb7f544806e9ba00e" exitCode=0 Mar 10 15:13:22 crc kubenswrapper[4743]: I0310 15:13:22.837744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wrsh" event={"ID":"26d17452-34b9-4c1e-a95c-3840493fe263","Type":"ContainerDied","Data":"c2de93cbabfbe9696bff877b968b7a5443cc189b57304cedb7f544806e9ba00e"} Mar 10 15:13:22 crc kubenswrapper[4743]: I0310 15:13:22.842845 4743 generic.go:334] "Generic (PLEG): container finished" podID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerID="707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f" exitCode=0 Mar 10 15:13:22 crc kubenswrapper[4743]: I0310 15:13:22.842954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvw9l" event={"ID":"190a3104-25c3-4135-bd63-b9e56380c9b9","Type":"ContainerDied","Data":"707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f"} Mar 10 15:13:22 crc kubenswrapper[4743]: I0310 15:13:22.843028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvw9l" event={"ID":"190a3104-25c3-4135-bd63-b9e56380c9b9","Type":"ContainerStarted","Data":"7aee3e398014285b016be28813ae9bfe6ea345f3a1ae1189d515c3cfdacd535a"} Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.482356 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2jrfb"] Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.483982 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.487053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.499052 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2jrfb"] Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.641904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxkn\" (UniqueName: \"kubernetes.io/projected/e008807d-9026-49b7-9a83-c375cd1f23cc-kube-api-access-xhxkn\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.642027 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e008807d-9026-49b7-9a83-c375cd1f23cc-catalog-content\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.642162 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e008807d-9026-49b7-9a83-c375cd1f23cc-utilities\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.678113 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lx45d"] Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.679803 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.682090 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.691296 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lx45d"] Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.744292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxkn\" (UniqueName: \"kubernetes.io/projected/e008807d-9026-49b7-9a83-c375cd1f23cc-kube-api-access-xhxkn\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.744382 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e008807d-9026-49b7-9a83-c375cd1f23cc-catalog-content\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.744450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e008807d-9026-49b7-9a83-c375cd1f23cc-utilities\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.745200 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e008807d-9026-49b7-9a83-c375cd1f23cc-utilities\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " 
pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.745270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e008807d-9026-49b7-9a83-c375cd1f23cc-catalog-content\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.769993 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxkn\" (UniqueName: \"kubernetes.io/projected/e008807d-9026-49b7-9a83-c375cd1f23cc-kube-api-access-xhxkn\") pod \"certified-operators-2jrfb\" (UID: \"e008807d-9026-49b7-9a83-c375cd1f23cc\") " pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.804352 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.847787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ffea72-0677-4f63-b0ba-5501881256da-utilities\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.847856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrhh\" (UniqueName: \"kubernetes.io/projected/f6ffea72-0677-4f63-b0ba-5501881256da-kube-api-access-jtrhh\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.847975 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ffea72-0677-4f63-b0ba-5501881256da-catalog-content\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.949648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ffea72-0677-4f63-b0ba-5501881256da-utilities\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.950312 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ffea72-0677-4f63-b0ba-5501881256da-utilities\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.950785 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrhh\" (UniqueName: \"kubernetes.io/projected/f6ffea72-0677-4f63-b0ba-5501881256da-kube-api-access-jtrhh\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.951346 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ffea72-0677-4f63-b0ba-5501881256da-catalog-content\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.953108 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ffea72-0677-4f63-b0ba-5501881256da-catalog-content\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:23 crc kubenswrapper[4743]: I0310 15:13:23.972597 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrhh\" (UniqueName: \"kubernetes.io/projected/f6ffea72-0677-4f63-b0ba-5501881256da-kube-api-access-jtrhh\") pod \"community-operators-lx45d\" (UID: \"f6ffea72-0677-4f63-b0ba-5501881256da\") " pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.006170 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.241844 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2jrfb"] Mar 10 15:13:24 crc kubenswrapper[4743]: W0310 15:13:24.249494 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode008807d_9026_49b7_9a83_c375cd1f23cc.slice/crio-6dcabe31c095e8bbf3c50cae6963494416f1b1c01f06783fec22d7d316be9f81 WatchSource:0}: Error finding container 6dcabe31c095e8bbf3c50cae6963494416f1b1c01f06783fec22d7d316be9f81: Status 404 returned error can't find the container with id 6dcabe31c095e8bbf3c50cae6963494416f1b1c01f06783fec22d7d316be9f81 Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.431192 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lx45d"] Mar 10 15:13:24 crc kubenswrapper[4743]: W0310 15:13:24.470734 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ffea72_0677_4f63_b0ba_5501881256da.slice/crio-d516ee31879103343a055c913c3c0e8aa58e5fd62f7e3029f87c2e1e1372d4c1 WatchSource:0}: Error finding container d516ee31879103343a055c913c3c0e8aa58e5fd62f7e3029f87c2e1e1372d4c1: Status 404 returned error can't find the container with id d516ee31879103343a055c913c3c0e8aa58e5fd62f7e3029f87c2e1e1372d4c1 Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.860030 4743 generic.go:334] "Generic (PLEG): container finished" podID="f6ffea72-0677-4f63-b0ba-5501881256da" containerID="3905fea5896026369a5ab34ded7a5d04b5d0b9cf861e173522ce3f56ebf63128" exitCode=0 Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.860141 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lx45d" event={"ID":"f6ffea72-0677-4f63-b0ba-5501881256da","Type":"ContainerDied","Data":"3905fea5896026369a5ab34ded7a5d04b5d0b9cf861e173522ce3f56ebf63128"} Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.860674 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lx45d" event={"ID":"f6ffea72-0677-4f63-b0ba-5501881256da","Type":"ContainerStarted","Data":"d516ee31879103343a055c913c3c0e8aa58e5fd62f7e3029f87c2e1e1372d4c1"} Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.864296 4743 generic.go:334] "Generic (PLEG): container finished" podID="e008807d-9026-49b7-9a83-c375cd1f23cc" containerID="56b528b31b6e4db115e294248aafc7e5ab346071b43949cad6d626fbb594aaa7" exitCode=0 Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.864395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrfb" event={"ID":"e008807d-9026-49b7-9a83-c375cd1f23cc","Type":"ContainerDied","Data":"56b528b31b6e4db115e294248aafc7e5ab346071b43949cad6d626fbb594aaa7"} Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.864439 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-2jrfb" event={"ID":"e008807d-9026-49b7-9a83-c375cd1f23cc","Type":"ContainerStarted","Data":"6dcabe31c095e8bbf3c50cae6963494416f1b1c01f06783fec22d7d316be9f81"} Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.867594 4743 generic.go:334] "Generic (PLEG): container finished" podID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerID="a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec" exitCode=0 Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.867682 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvw9l" event={"ID":"190a3104-25c3-4135-bd63-b9e56380c9b9","Type":"ContainerDied","Data":"a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec"} Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.871557 4743 generic.go:334] "Generic (PLEG): container finished" podID="26d17452-34b9-4c1e-a95c-3840493fe263" containerID="4203017c0925fb8871515f12cdb0d2598298b019a0db4df27b2ae0c796bf874e" exitCode=0 Mar 10 15:13:24 crc kubenswrapper[4743]: I0310 15:13:24.871602 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wrsh" event={"ID":"26d17452-34b9-4c1e-a95c-3840493fe263","Type":"ContainerDied","Data":"4203017c0925fb8871515f12cdb0d2598298b019a0db4df27b2ae0c796bf874e"} Mar 10 15:13:25 crc kubenswrapper[4743]: I0310 15:13:25.882414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvw9l" event={"ID":"190a3104-25c3-4135-bd63-b9e56380c9b9","Type":"ContainerStarted","Data":"8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd"} Mar 10 15:13:25 crc kubenswrapper[4743]: I0310 15:13:25.885227 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wrsh" 
event={"ID":"26d17452-34b9-4c1e-a95c-3840493fe263","Type":"ContainerStarted","Data":"74053ac2588f48132a418a8f9f14168c75690432fb61dee8d694a236a4558cd2"} Mar 10 15:13:25 crc kubenswrapper[4743]: I0310 15:13:25.887843 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lx45d" event={"ID":"f6ffea72-0677-4f63-b0ba-5501881256da","Type":"ContainerStarted","Data":"95fa82d509eeefc3e5d2bd0b73e0380a0385a14ba5b1e5a251e0f2033ccfc53a"} Mar 10 15:13:25 crc kubenswrapper[4743]: I0310 15:13:25.890915 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrfb" event={"ID":"e008807d-9026-49b7-9a83-c375cd1f23cc","Type":"ContainerStarted","Data":"808d7e580b6aedba2ec7aa53d69680f616ceaed4f795ec3166679bd824ea7fea"} Mar 10 15:13:25 crc kubenswrapper[4743]: I0310 15:13:25.914027 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cvw9l" podStartSLOduration=2.4852514230000002 podStartE2EDuration="4.913996701s" podCreationTimestamp="2026-03-10 15:13:21 +0000 UTC" firstStartedPulling="2026-03-10 15:13:22.844217224 +0000 UTC m=+467.551031972" lastFinishedPulling="2026-03-10 15:13:25.272962502 +0000 UTC m=+469.979777250" observedRunningTime="2026-03-10 15:13:25.910918559 +0000 UTC m=+470.617733307" watchObservedRunningTime="2026-03-10 15:13:25.913996701 +0000 UTC m=+470.620811459" Mar 10 15:13:25 crc kubenswrapper[4743]: I0310 15:13:25.968287 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9wrsh" podStartSLOduration=2.524376276 podStartE2EDuration="4.968262544s" podCreationTimestamp="2026-03-10 15:13:21 +0000 UTC" firstStartedPulling="2026-03-10 15:13:22.839307998 +0000 UTC m=+467.546122756" lastFinishedPulling="2026-03-10 15:13:25.283194276 +0000 UTC m=+469.990009024" observedRunningTime="2026-03-10 15:13:25.964339587 +0000 UTC m=+470.671154335" 
watchObservedRunningTime="2026-03-10 15:13:25.968262544 +0000 UTC m=+470.675077292" Mar 10 15:13:26 crc kubenswrapper[4743]: I0310 15:13:26.898441 4743 generic.go:334] "Generic (PLEG): container finished" podID="f6ffea72-0677-4f63-b0ba-5501881256da" containerID="95fa82d509eeefc3e5d2bd0b73e0380a0385a14ba5b1e5a251e0f2033ccfc53a" exitCode=0 Mar 10 15:13:26 crc kubenswrapper[4743]: I0310 15:13:26.898536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lx45d" event={"ID":"f6ffea72-0677-4f63-b0ba-5501881256da","Type":"ContainerDied","Data":"95fa82d509eeefc3e5d2bd0b73e0380a0385a14ba5b1e5a251e0f2033ccfc53a"} Mar 10 15:13:26 crc kubenswrapper[4743]: I0310 15:13:26.902002 4743 generic.go:334] "Generic (PLEG): container finished" podID="e008807d-9026-49b7-9a83-c375cd1f23cc" containerID="808d7e580b6aedba2ec7aa53d69680f616ceaed4f795ec3166679bd824ea7fea" exitCode=0 Mar 10 15:13:26 crc kubenswrapper[4743]: I0310 15:13:26.902083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrfb" event={"ID":"e008807d-9026-49b7-9a83-c375cd1f23cc","Type":"ContainerDied","Data":"808d7e580b6aedba2ec7aa53d69680f616ceaed4f795ec3166679bd824ea7fea"} Mar 10 15:13:27 crc kubenswrapper[4743]: I0310 15:13:27.910278 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lx45d" event={"ID":"f6ffea72-0677-4f63-b0ba-5501881256da","Type":"ContainerStarted","Data":"868b6bd16c8458ce1ce7de9afc8541bf966f87c318f324449102d90dcf326d3a"} Mar 10 15:13:27 crc kubenswrapper[4743]: I0310 15:13:27.913892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrfb" event={"ID":"e008807d-9026-49b7-9a83-c375cd1f23cc","Type":"ContainerStarted","Data":"7e89c35d6b6722d4bdc746925343f8fa467be209a8246717b34bfd4319e0e254"} Mar 10 15:13:27 crc kubenswrapper[4743]: I0310 15:13:27.959322 4743 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/community-operators-lx45d" podStartSLOduration=2.507244837 podStartE2EDuration="4.959293448s" podCreationTimestamp="2026-03-10 15:13:23 +0000 UTC" firstStartedPulling="2026-03-10 15:13:24.862685684 +0000 UTC m=+469.569500432" lastFinishedPulling="2026-03-10 15:13:27.314734295 +0000 UTC m=+472.021549043" observedRunningTime="2026-03-10 15:13:27.955916558 +0000 UTC m=+472.662731306" watchObservedRunningTime="2026-03-10 15:13:27.959293448 +0000 UTC m=+472.666108196" Mar 10 15:13:31 crc kubenswrapper[4743]: I0310 15:13:31.418420 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:31 crc kubenswrapper[4743]: I0310 15:13:31.418936 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:31 crc kubenswrapper[4743]: I0310 15:13:31.462647 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:31 crc kubenswrapper[4743]: I0310 15:13:31.483963 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2jrfb" podStartSLOduration=5.9774831299999995 podStartE2EDuration="8.483942678s" podCreationTimestamp="2026-03-10 15:13:23 +0000 UTC" firstStartedPulling="2026-03-10 15:13:24.870648351 +0000 UTC m=+469.577463099" lastFinishedPulling="2026-03-10 15:13:27.377107899 +0000 UTC m=+472.083922647" observedRunningTime="2026-03-10 15:13:27.982269191 +0000 UTC m=+472.689083949" watchObservedRunningTime="2026-03-10 15:13:31.483942678 +0000 UTC m=+476.190757426" Mar 10 15:13:31 crc kubenswrapper[4743]: I0310 15:13:31.616974 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:31 crc kubenswrapper[4743]: I0310 15:13:31.617075 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:31 crc kubenswrapper[4743]: I0310 15:13:31.991494 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9wrsh" Mar 10 15:13:32 crc kubenswrapper[4743]: I0310 15:13:32.658071 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cvw9l" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="registry-server" probeResult="failure" output=< Mar 10 15:13:32 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 15:13:32 crc kubenswrapper[4743]: > Mar 10 15:13:33 crc kubenswrapper[4743]: I0310 15:13:33.805246 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:33 crc kubenswrapper[4743]: I0310 15:13:33.805335 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:33 crc kubenswrapper[4743]: I0310 15:13:33.847501 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:33 crc kubenswrapper[4743]: I0310 15:13:33.999017 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2jrfb" Mar 10 15:13:34 crc kubenswrapper[4743]: I0310 15:13:34.007127 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:34 crc kubenswrapper[4743]: I0310 15:13:34.007198 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:34 crc kubenswrapper[4743]: I0310 15:13:34.056938 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:34 crc kubenswrapper[4743]: I0310 15:13:34.993554 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lx45d" Mar 10 15:13:35 crc kubenswrapper[4743]: I0310 15:13:35.840956 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" podUID="199e5a98-b472-45af-9088-ffe163ceba78" containerName="registry" containerID="cri-o://be0610a1dd3a80a5a4f87713d35cd0e21d06edc1979c61feb7fa273808d85f6c" gracePeriod=30 Mar 10 15:13:36 crc kubenswrapper[4743]: I0310 15:13:36.965761 4743 generic.go:334] "Generic (PLEG): container finished" podID="199e5a98-b472-45af-9088-ffe163ceba78" containerID="be0610a1dd3a80a5a4f87713d35cd0e21d06edc1979c61feb7fa273808d85f6c" exitCode=0 Mar 10 15:13:36 crc kubenswrapper[4743]: I0310 15:13:36.965997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" event={"ID":"199e5a98-b472-45af-9088-ffe163ceba78","Type":"ContainerDied","Data":"be0610a1dd3a80a5a4f87713d35cd0e21d06edc1979c61feb7fa273808d85f6c"} Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.008057 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.142620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pbnj\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-kube-api-access-4pbnj\") pod \"199e5a98-b472-45af-9088-ffe163ceba78\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.142680 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-bound-sa-token\") pod \"199e5a98-b472-45af-9088-ffe163ceba78\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.142736 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/199e5a98-b472-45af-9088-ffe163ceba78-ca-trust-extracted\") pod \"199e5a98-b472-45af-9088-ffe163ceba78\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.143091 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"199e5a98-b472-45af-9088-ffe163ceba78\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.143292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-registry-certificates\") pod \"199e5a98-b472-45af-9088-ffe163ceba78\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.143360 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/199e5a98-b472-45af-9088-ffe163ceba78-installation-pull-secrets\") pod \"199e5a98-b472-45af-9088-ffe163ceba78\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.143384 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-registry-tls\") pod \"199e5a98-b472-45af-9088-ffe163ceba78\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.143406 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-trusted-ca\") pod \"199e5a98-b472-45af-9088-ffe163ceba78\" (UID: \"199e5a98-b472-45af-9088-ffe163ceba78\") " Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.144376 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "199e5a98-b472-45af-9088-ffe163ceba78" (UID: "199e5a98-b472-45af-9088-ffe163ceba78"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.144562 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "199e5a98-b472-45af-9088-ffe163ceba78" (UID: "199e5a98-b472-45af-9088-ffe163ceba78"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.156352 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "199e5a98-b472-45af-9088-ffe163ceba78" (UID: "199e5a98-b472-45af-9088-ffe163ceba78"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.158393 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "199e5a98-b472-45af-9088-ffe163ceba78" (UID: "199e5a98-b472-45af-9088-ffe163ceba78"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.159172 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-kube-api-access-4pbnj" (OuterVolumeSpecName: "kube-api-access-4pbnj") pod "199e5a98-b472-45af-9088-ffe163ceba78" (UID: "199e5a98-b472-45af-9088-ffe163ceba78"). InnerVolumeSpecName "kube-api-access-4pbnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.159500 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199e5a98-b472-45af-9088-ffe163ceba78-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "199e5a98-b472-45af-9088-ffe163ceba78" (UID: "199e5a98-b472-45af-9088-ffe163ceba78"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.160246 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e5a98-b472-45af-9088-ffe163ceba78-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "199e5a98-b472-45af-9088-ffe163ceba78" (UID: "199e5a98-b472-45af-9088-ffe163ceba78"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.180462 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "199e5a98-b472-45af-9088-ffe163ceba78" (UID: "199e5a98-b472-45af-9088-ffe163ceba78"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.244586 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.244632 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.244643 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/199e5a98-b472-45af-9088-ffe163ceba78-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.244652 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/199e5a98-b472-45af-9088-ffe163ceba78-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.244660 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pbnj\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-kube-api-access-4pbnj\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.244677 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/199e5a98-b472-45af-9088-ffe163ceba78-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.244687 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/199e5a98-b472-45af-9088-ffe163ceba78-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.973846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" event={"ID":"199e5a98-b472-45af-9088-ffe163ceba78","Type":"ContainerDied","Data":"0450bca02e49efd41f9282c9cc97de58da1e5fc154f2db307dc27f87a468cb3e"} Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.973928 4743 scope.go:117] "RemoveContainer" containerID="be0610a1dd3a80a5a4f87713d35cd0e21d06edc1979c61feb7fa273808d85f6c" Mar 10 15:13:37 crc kubenswrapper[4743]: I0310 15:13:37.973922 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mxlth" Mar 10 15:13:38 crc kubenswrapper[4743]: I0310 15:13:38.000610 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mxlth"] Mar 10 15:13:38 crc kubenswrapper[4743]: I0310 15:13:38.007214 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mxlth"] Mar 10 15:13:39 crc kubenswrapper[4743]: I0310 15:13:39.922543 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199e5a98-b472-45af-9088-ffe163ceba78" path="/var/lib/kubelet/pods/199e5a98-b472-45af-9088-ffe163ceba78/volumes" Mar 10 15:13:41 crc kubenswrapper[4743]: I0310 15:13:41.252852 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:13:41 crc kubenswrapper[4743]: I0310 15:13:41.252958 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:13:41 crc kubenswrapper[4743]: I0310 15:13:41.669160 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:41 crc kubenswrapper[4743]: I0310 15:13:41.712505 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 15:13:43 crc kubenswrapper[4743]: I0310 15:13:43.620618 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.138772 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552594-g99n7"] Mar 10 15:14:00 crc kubenswrapper[4743]: E0310 15:14:00.140024 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199e5a98-b472-45af-9088-ffe163ceba78" containerName="registry" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.140052 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="199e5a98-b472-45af-9088-ffe163ceba78" containerName="registry" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.140149 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="199e5a98-b472-45af-9088-ffe163ceba78" containerName="registry" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.140639 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-g99n7" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.145357 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.145489 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.145713 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.150708 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-g99n7"] Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.277741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8khj\" (UniqueName: 
\"kubernetes.io/projected/665e9938-cfcb-45ff-8bb8-158fb0c7d2d9-kube-api-access-c8khj\") pod \"auto-csr-approver-29552594-g99n7\" (UID: \"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9\") " pod="openshift-infra/auto-csr-approver-29552594-g99n7" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.379837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8khj\" (UniqueName: \"kubernetes.io/projected/665e9938-cfcb-45ff-8bb8-158fb0c7d2d9-kube-api-access-c8khj\") pod \"auto-csr-approver-29552594-g99n7\" (UID: \"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9\") " pod="openshift-infra/auto-csr-approver-29552594-g99n7" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.403857 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8khj\" (UniqueName: \"kubernetes.io/projected/665e9938-cfcb-45ff-8bb8-158fb0c7d2d9-kube-api-access-c8khj\") pod \"auto-csr-approver-29552594-g99n7\" (UID: \"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9\") " pod="openshift-infra/auto-csr-approver-29552594-g99n7" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.467950 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-g99n7" Mar 10 15:14:00 crc kubenswrapper[4743]: I0310 15:14:00.892763 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-g99n7"] Mar 10 15:14:01 crc kubenswrapper[4743]: I0310 15:14:01.107696 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-g99n7" event={"ID":"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9","Type":"ContainerStarted","Data":"4f037368f6070da93faf78c5f1cfaf7ca18e587d53539d84bc5ec018afeed8d8"} Mar 10 15:14:02 crc kubenswrapper[4743]: I0310 15:14:02.138083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-g99n7" event={"ID":"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9","Type":"ContainerStarted","Data":"781be795c7e4e4a724afed1359bbfca1a4b78fdb535b5330a50853c2c57638f4"} Mar 10 15:14:02 crc kubenswrapper[4743]: I0310 15:14:02.154885 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552594-g99n7" podStartSLOduration=1.209162549 podStartE2EDuration="2.154863765s" podCreationTimestamp="2026-03-10 15:14:00 +0000 UTC" firstStartedPulling="2026-03-10 15:14:00.896193834 +0000 UTC m=+505.603008622" lastFinishedPulling="2026-03-10 15:14:01.84189509 +0000 UTC m=+506.548709838" observedRunningTime="2026-03-10 15:14:02.154373691 +0000 UTC m=+506.861188439" watchObservedRunningTime="2026-03-10 15:14:02.154863765 +0000 UTC m=+506.861678533" Mar 10 15:14:03 crc kubenswrapper[4743]: I0310 15:14:03.146435 4743 generic.go:334] "Generic (PLEG): container finished" podID="665e9938-cfcb-45ff-8bb8-158fb0c7d2d9" containerID="781be795c7e4e4a724afed1359bbfca1a4b78fdb535b5330a50853c2c57638f4" exitCode=0 Mar 10 15:14:03 crc kubenswrapper[4743]: I0310 15:14:03.146503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-g99n7" 
event={"ID":"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9","Type":"ContainerDied","Data":"781be795c7e4e4a724afed1359bbfca1a4b78fdb535b5330a50853c2c57638f4"} Mar 10 15:14:04 crc kubenswrapper[4743]: I0310 15:14:04.379363 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-g99n7" Mar 10 15:14:04 crc kubenswrapper[4743]: I0310 15:14:04.434474 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8khj\" (UniqueName: \"kubernetes.io/projected/665e9938-cfcb-45ff-8bb8-158fb0c7d2d9-kube-api-access-c8khj\") pod \"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9\" (UID: \"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9\") " Mar 10 15:14:04 crc kubenswrapper[4743]: I0310 15:14:04.441929 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665e9938-cfcb-45ff-8bb8-158fb0c7d2d9-kube-api-access-c8khj" (OuterVolumeSpecName: "kube-api-access-c8khj") pod "665e9938-cfcb-45ff-8bb8-158fb0c7d2d9" (UID: "665e9938-cfcb-45ff-8bb8-158fb0c7d2d9"). InnerVolumeSpecName "kube-api-access-c8khj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:14:04 crc kubenswrapper[4743]: I0310 15:14:04.536690 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8khj\" (UniqueName: \"kubernetes.io/projected/665e9938-cfcb-45ff-8bb8-158fb0c7d2d9-kube-api-access-c8khj\") on node \"crc\" DevicePath \"\"" Mar 10 15:14:05 crc kubenswrapper[4743]: I0310 15:14:05.160423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-g99n7" event={"ID":"665e9938-cfcb-45ff-8bb8-158fb0c7d2d9","Type":"ContainerDied","Data":"4f037368f6070da93faf78c5f1cfaf7ca18e587d53539d84bc5ec018afeed8d8"} Mar 10 15:14:05 crc kubenswrapper[4743]: I0310 15:14:05.160492 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f037368f6070da93faf78c5f1cfaf7ca18e587d53539d84bc5ec018afeed8d8" Mar 10 15:14:05 crc kubenswrapper[4743]: I0310 15:14:05.160533 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-g99n7" Mar 10 15:14:05 crc kubenswrapper[4743]: I0310 15:14:05.222116 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-mtmtv"] Mar 10 15:14:05 crc kubenswrapper[4743]: I0310 15:14:05.227555 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-mtmtv"] Mar 10 15:14:05 crc kubenswrapper[4743]: I0310 15:14:05.926041 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7837cec9-3686-497f-b9ec-2525768cd8ce" path="/var/lib/kubelet/pods/7837cec9-3686-497f-b9ec-2525768cd8ce/volumes" Mar 10 15:14:11 crc kubenswrapper[4743]: I0310 15:14:11.252487 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 15:14:11 crc kubenswrapper[4743]: I0310 15:14:11.252973 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:14:11 crc kubenswrapper[4743]: I0310 15:14:11.253081 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:14:11 crc kubenswrapper[4743]: I0310 15:14:11.253729 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03018d7637364bc245db63b69660e40a53f14d3f7016873be1f03ec6299ce4d5"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:14:11 crc kubenswrapper[4743]: I0310 15:14:11.254162 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://03018d7637364bc245db63b69660e40a53f14d3f7016873be1f03ec6299ce4d5" gracePeriod=600 Mar 10 15:14:12 crc kubenswrapper[4743]: I0310 15:14:12.207411 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="03018d7637364bc245db63b69660e40a53f14d3f7016873be1f03ec6299ce4d5" exitCode=0 Mar 10 15:14:12 crc kubenswrapper[4743]: I0310 15:14:12.207484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"03018d7637364bc245db63b69660e40a53f14d3f7016873be1f03ec6299ce4d5"} Mar 10 15:14:12 crc kubenswrapper[4743]: I0310 15:14:12.208493 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"485a71da308cafc31c75bf433fce757dfcdc428af1dc1ad1dec2295c6a5710ee"} Mar 10 15:14:12 crc kubenswrapper[4743]: I0310 15:14:12.208590 4743 scope.go:117] "RemoveContainer" containerID="b89445e203720353c889695431d10f5cd1ff53c805bb1460cc7903465d322fb2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.140948 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2"] Mar 10 15:15:00 crc kubenswrapper[4743]: E0310 15:15:00.142009 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="665e9938-cfcb-45ff-8bb8-158fb0c7d2d9" containerName="oc" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.142027 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="665e9938-cfcb-45ff-8bb8-158fb0c7d2d9" containerName="oc" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.142191 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="665e9938-cfcb-45ff-8bb8-158fb0c7d2d9" containerName="oc" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.142753 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.148160 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.148212 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.172984 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2"] Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.264848 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af8e51-8ca1-41a0-9abc-3c7c33e01000-secret-volume\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.264998 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af8e51-8ca1-41a0-9abc-3c7c33e01000-config-volume\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.265032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wknf9\" (UniqueName: \"kubernetes.io/projected/79af8e51-8ca1-41a0-9abc-3c7c33e01000-kube-api-access-wknf9\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.366660 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af8e51-8ca1-41a0-9abc-3c7c33e01000-secret-volume\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.366736 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af8e51-8ca1-41a0-9abc-3c7c33e01000-config-volume\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.366762 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wknf9\" (UniqueName: \"kubernetes.io/projected/79af8e51-8ca1-41a0-9abc-3c7c33e01000-kube-api-access-wknf9\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.368034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af8e51-8ca1-41a0-9abc-3c7c33e01000-config-volume\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.376861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/79af8e51-8ca1-41a0-9abc-3c7c33e01000-secret-volume\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.384676 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wknf9\" (UniqueName: \"kubernetes.io/projected/79af8e51-8ca1-41a0-9abc-3c7c33e01000-kube-api-access-wknf9\") pod \"collect-profiles-29552595-vq8s2\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.466477 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:00 crc kubenswrapper[4743]: I0310 15:15:00.656405 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2"] Mar 10 15:15:01 crc kubenswrapper[4743]: I0310 15:15:01.665121 4743 generic.go:334] "Generic (PLEG): container finished" podID="79af8e51-8ca1-41a0-9abc-3c7c33e01000" containerID="557c942064107673d0064b67df108324afaaa44b13040191c8d64bc30024bea0" exitCode=0 Mar 10 15:15:01 crc kubenswrapper[4743]: I0310 15:15:01.665235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" event={"ID":"79af8e51-8ca1-41a0-9abc-3c7c33e01000","Type":"ContainerDied","Data":"557c942064107673d0064b67df108324afaaa44b13040191c8d64bc30024bea0"} Mar 10 15:15:01 crc kubenswrapper[4743]: I0310 15:15:01.665478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" 
event={"ID":"79af8e51-8ca1-41a0-9abc-3c7c33e01000","Type":"ContainerStarted","Data":"85846d11c356ec30871bd6269f8f0f2872083fc183eacb24a7858e216022901a"} Mar 10 15:15:02 crc kubenswrapper[4743]: I0310 15:15:02.897619 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.004436 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af8e51-8ca1-41a0-9abc-3c7c33e01000-config-volume\") pod \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.004513 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wknf9\" (UniqueName: \"kubernetes.io/projected/79af8e51-8ca1-41a0-9abc-3c7c33e01000-kube-api-access-wknf9\") pod \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.004572 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af8e51-8ca1-41a0-9abc-3c7c33e01000-secret-volume\") pod \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\" (UID: \"79af8e51-8ca1-41a0-9abc-3c7c33e01000\") " Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.006209 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79af8e51-8ca1-41a0-9abc-3c7c33e01000-config-volume" (OuterVolumeSpecName: "config-volume") pod "79af8e51-8ca1-41a0-9abc-3c7c33e01000" (UID: "79af8e51-8ca1-41a0-9abc-3c7c33e01000"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.011369 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79af8e51-8ca1-41a0-9abc-3c7c33e01000-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "79af8e51-8ca1-41a0-9abc-3c7c33e01000" (UID: "79af8e51-8ca1-41a0-9abc-3c7c33e01000"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.011668 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79af8e51-8ca1-41a0-9abc-3c7c33e01000-kube-api-access-wknf9" (OuterVolumeSpecName: "kube-api-access-wknf9") pod "79af8e51-8ca1-41a0-9abc-3c7c33e01000" (UID: "79af8e51-8ca1-41a0-9abc-3c7c33e01000"). InnerVolumeSpecName "kube-api-access-wknf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.106757 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af8e51-8ca1-41a0-9abc-3c7c33e01000-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.106876 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wknf9\" (UniqueName: \"kubernetes.io/projected/79af8e51-8ca1-41a0-9abc-3c7c33e01000-kube-api-access-wknf9\") on node \"crc\" DevicePath \"\"" Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.106918 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af8e51-8ca1-41a0-9abc-3c7c33e01000-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.687995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" 
event={"ID":"79af8e51-8ca1-41a0-9abc-3c7c33e01000","Type":"ContainerDied","Data":"85846d11c356ec30871bd6269f8f0f2872083fc183eacb24a7858e216022901a"} Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.688046 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85846d11c356ec30871bd6269f8f0f2872083fc183eacb24a7858e216022901a" Mar 10 15:15:03 crc kubenswrapper[4743]: I0310 15:15:03.688065 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.141360 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552596-8pxvn"] Mar 10 15:16:00 crc kubenswrapper[4743]: E0310 15:16:00.142305 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79af8e51-8ca1-41a0-9abc-3c7c33e01000" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.142321 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="79af8e51-8ca1-41a0-9abc-3c7c33e01000" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.142439 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="79af8e51-8ca1-41a0-9abc-3c7c33e01000" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.142935 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-8pxvn" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.146295 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.151080 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-8pxvn"] Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.151261 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.151274 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.290013 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9j6x\" (UniqueName: \"kubernetes.io/projected/8036ca91-3733-4e07-a583-4265f162c6ed-kube-api-access-n9j6x\") pod \"auto-csr-approver-29552596-8pxvn\" (UID: \"8036ca91-3733-4e07-a583-4265f162c6ed\") " pod="openshift-infra/auto-csr-approver-29552596-8pxvn" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.391368 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9j6x\" (UniqueName: \"kubernetes.io/projected/8036ca91-3733-4e07-a583-4265f162c6ed-kube-api-access-n9j6x\") pod \"auto-csr-approver-29552596-8pxvn\" (UID: \"8036ca91-3733-4e07-a583-4265f162c6ed\") " pod="openshift-infra/auto-csr-approver-29552596-8pxvn" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.415068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9j6x\" (UniqueName: \"kubernetes.io/projected/8036ca91-3733-4e07-a583-4265f162c6ed-kube-api-access-n9j6x\") pod \"auto-csr-approver-29552596-8pxvn\" (UID: \"8036ca91-3733-4e07-a583-4265f162c6ed\") " 
pod="openshift-infra/auto-csr-approver-29552596-8pxvn" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.471780 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-8pxvn" Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.695874 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-8pxvn"] Mar 10 15:16:00 crc kubenswrapper[4743]: I0310 15:16:00.706119 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:16:01 crc kubenswrapper[4743]: I0310 15:16:01.071865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-8pxvn" event={"ID":"8036ca91-3733-4e07-a583-4265f162c6ed","Type":"ContainerStarted","Data":"0c5e8a8fb003dd8e6781bf65b5522dc89b9078760fa46e237fbd51a7162396d5"} Mar 10 15:16:02 crc kubenswrapper[4743]: I0310 15:16:02.086593 4743 generic.go:334] "Generic (PLEG): container finished" podID="8036ca91-3733-4e07-a583-4265f162c6ed" containerID="f0791db09e5053e2242de874db3e1d91429bdfcf4f6251eacff0ff41ad552725" exitCode=0 Mar 10 15:16:02 crc kubenswrapper[4743]: I0310 15:16:02.086650 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-8pxvn" event={"ID":"8036ca91-3733-4e07-a583-4265f162c6ed","Type":"ContainerDied","Data":"f0791db09e5053e2242de874db3e1d91429bdfcf4f6251eacff0ff41ad552725"} Mar 10 15:16:03 crc kubenswrapper[4743]: I0310 15:16:03.335500 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-8pxvn" Mar 10 15:16:03 crc kubenswrapper[4743]: I0310 15:16:03.432613 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9j6x\" (UniqueName: \"kubernetes.io/projected/8036ca91-3733-4e07-a583-4265f162c6ed-kube-api-access-n9j6x\") pod \"8036ca91-3733-4e07-a583-4265f162c6ed\" (UID: \"8036ca91-3733-4e07-a583-4265f162c6ed\") " Mar 10 15:16:03 crc kubenswrapper[4743]: I0310 15:16:03.441729 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8036ca91-3733-4e07-a583-4265f162c6ed-kube-api-access-n9j6x" (OuterVolumeSpecName: "kube-api-access-n9j6x") pod "8036ca91-3733-4e07-a583-4265f162c6ed" (UID: "8036ca91-3733-4e07-a583-4265f162c6ed"). InnerVolumeSpecName "kube-api-access-n9j6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:16:03 crc kubenswrapper[4743]: I0310 15:16:03.534405 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9j6x\" (UniqueName: \"kubernetes.io/projected/8036ca91-3733-4e07-a583-4265f162c6ed-kube-api-access-n9j6x\") on node \"crc\" DevicePath \"\"" Mar 10 15:16:04 crc kubenswrapper[4743]: I0310 15:16:04.100176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-8pxvn" event={"ID":"8036ca91-3733-4e07-a583-4265f162c6ed","Type":"ContainerDied","Data":"0c5e8a8fb003dd8e6781bf65b5522dc89b9078760fa46e237fbd51a7162396d5"} Mar 10 15:16:04 crc kubenswrapper[4743]: I0310 15:16:04.100221 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c5e8a8fb003dd8e6781bf65b5522dc89b9078760fa46e237fbd51a7162396d5" Mar 10 15:16:04 crc kubenswrapper[4743]: I0310 15:16:04.100264 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-8pxvn" Mar 10 15:16:04 crc kubenswrapper[4743]: I0310 15:16:04.398489 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2bvk6"] Mar 10 15:16:04 crc kubenswrapper[4743]: I0310 15:16:04.403218 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2bvk6"] Mar 10 15:16:05 crc kubenswrapper[4743]: I0310 15:16:05.923769 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6ef377-422d-42a7-aedb-5adad149a2bc" path="/var/lib/kubelet/pods/ac6ef377-422d-42a7-aedb-5adad149a2bc/volumes" Mar 10 15:16:11 crc kubenswrapper[4743]: I0310 15:16:11.253110 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:16:11 crc kubenswrapper[4743]: I0310 15:16:11.253636 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:16:36 crc kubenswrapper[4743]: I0310 15:16:36.317594 4743 scope.go:117] "RemoveContainer" containerID="3d3978e5601da2327dd3b88d62099deb80fe00f1900a29a2184ec135db7f1b97" Mar 10 15:16:36 crc kubenswrapper[4743]: I0310 15:16:36.354552 4743 scope.go:117] "RemoveContainer" containerID="2ae4666d4e659876cfcac4a569c66230e6faee8716566c2c5fe84bb6cb8bc3f1" Mar 10 15:16:41 crc kubenswrapper[4743]: I0310 15:16:41.253101 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:16:41 crc kubenswrapper[4743]: I0310 15:16:41.253760 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:17:11 crc kubenswrapper[4743]: I0310 15:17:11.252999 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:17:11 crc kubenswrapper[4743]: I0310 15:17:11.253938 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:17:11 crc kubenswrapper[4743]: I0310 15:17:11.254013 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:17:11 crc kubenswrapper[4743]: I0310 15:17:11.254996 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"485a71da308cafc31c75bf433fce757dfcdc428af1dc1ad1dec2295c6a5710ee"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:17:11 crc kubenswrapper[4743]: I0310 15:17:11.255104 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://485a71da308cafc31c75bf433fce757dfcdc428af1dc1ad1dec2295c6a5710ee" gracePeriod=600 Mar 10 15:17:11 crc kubenswrapper[4743]: I0310 15:17:11.528472 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="485a71da308cafc31c75bf433fce757dfcdc428af1dc1ad1dec2295c6a5710ee" exitCode=0 Mar 10 15:17:11 crc kubenswrapper[4743]: I0310 15:17:11.528536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"485a71da308cafc31c75bf433fce757dfcdc428af1dc1ad1dec2295c6a5710ee"} Mar 10 15:17:11 crc kubenswrapper[4743]: I0310 15:17:11.528593 4743 scope.go:117] "RemoveContainer" containerID="03018d7637364bc245db63b69660e40a53f14d3f7016873be1f03ec6299ce4d5" Mar 10 15:17:12 crc kubenswrapper[4743]: I0310 15:17:12.537063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"97d590bdb55b8d3eeaf67e203c3704815de1593197109afcafe553653c1d6c9f"} Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.140655 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552598-65srp"] Mar 10 15:18:00 crc kubenswrapper[4743]: E0310 15:18:00.141632 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8036ca91-3733-4e07-a583-4265f162c6ed" containerName="oc" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.141654 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8036ca91-3733-4e07-a583-4265f162c6ed" containerName="oc" Mar 10 15:18:00 crc 
kubenswrapper[4743]: I0310 15:18:00.141773 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8036ca91-3733-4e07-a583-4265f162c6ed" containerName="oc" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.142265 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-65srp" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.144266 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.144621 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.144947 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.154752 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-65srp"] Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.307988 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7klts\" (UniqueName: \"kubernetes.io/projected/ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4-kube-api-access-7klts\") pod \"auto-csr-approver-29552598-65srp\" (UID: \"ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4\") " pod="openshift-infra/auto-csr-approver-29552598-65srp" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.409294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7klts\" (UniqueName: \"kubernetes.io/projected/ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4-kube-api-access-7klts\") pod \"auto-csr-approver-29552598-65srp\" (UID: \"ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4\") " pod="openshift-infra/auto-csr-approver-29552598-65srp" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.444965 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7klts\" (UniqueName: \"kubernetes.io/projected/ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4-kube-api-access-7klts\") pod \"auto-csr-approver-29552598-65srp\" (UID: \"ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4\") " pod="openshift-infra/auto-csr-approver-29552598-65srp" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.503109 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-65srp" Mar 10 15:18:00 crc kubenswrapper[4743]: I0310 15:18:00.967275 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-65srp"] Mar 10 15:18:01 crc kubenswrapper[4743]: I0310 15:18:01.874280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-65srp" event={"ID":"ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4","Type":"ContainerStarted","Data":"844e466c2f02c87959def2c3487f9cab8952e9e4f69338943f5b696c6ae5a25a"} Mar 10 15:18:02 crc kubenswrapper[4743]: I0310 15:18:02.883002 4743 generic.go:334] "Generic (PLEG): container finished" podID="ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4" containerID="e0c14e8af7b025d9958f803b3c3e43d97c2b88ed0721b93a673f950b69b7fab8" exitCode=0 Mar 10 15:18:02 crc kubenswrapper[4743]: I0310 15:18:02.883091 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-65srp" event={"ID":"ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4","Type":"ContainerDied","Data":"e0c14e8af7b025d9958f803b3c3e43d97c2b88ed0721b93a673f950b69b7fab8"} Mar 10 15:18:04 crc kubenswrapper[4743]: I0310 15:18:04.197842 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-65srp" Mar 10 15:18:04 crc kubenswrapper[4743]: I0310 15:18:04.358005 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7klts\" (UniqueName: \"kubernetes.io/projected/ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4-kube-api-access-7klts\") pod \"ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4\" (UID: \"ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4\") " Mar 10 15:18:04 crc kubenswrapper[4743]: I0310 15:18:04.366056 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4-kube-api-access-7klts" (OuterVolumeSpecName: "kube-api-access-7klts") pod "ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4" (UID: "ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4"). InnerVolumeSpecName "kube-api-access-7klts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:18:04 crc kubenswrapper[4743]: I0310 15:18:04.460032 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7klts\" (UniqueName: \"kubernetes.io/projected/ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4-kube-api-access-7klts\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:04 crc kubenswrapper[4743]: I0310 15:18:04.896917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-65srp" event={"ID":"ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4","Type":"ContainerDied","Data":"844e466c2f02c87959def2c3487f9cab8952e9e4f69338943f5b696c6ae5a25a"} Mar 10 15:18:04 crc kubenswrapper[4743]: I0310 15:18:04.896973 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844e466c2f02c87959def2c3487f9cab8952e9e4f69338943f5b696c6ae5a25a" Mar 10 15:18:04 crc kubenswrapper[4743]: I0310 15:18:04.897074 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-65srp" Mar 10 15:18:05 crc kubenswrapper[4743]: I0310 15:18:05.278130 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-mmgm4"] Mar 10 15:18:05 crc kubenswrapper[4743]: I0310 15:18:05.288116 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-mmgm4"] Mar 10 15:18:05 crc kubenswrapper[4743]: I0310 15:18:05.928155 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7d8aeb-2177-44ee-b7e0-d745883183af" path="/var/lib/kubelet/pods/9c7d8aeb-2177-44ee-b7e0-d745883183af/volumes" Mar 10 15:18:36 crc kubenswrapper[4743]: I0310 15:18:36.425037 4743 scope.go:117] "RemoveContainer" containerID="664574cde011ed7c51cca1b801dd20432041002097974fd6bc518261037628c6" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.949839 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2"] Mar 10 15:18:40 crc kubenswrapper[4743]: E0310 15:18:40.950889 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4" containerName="oc" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.950917 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4" containerName="oc" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.951126 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4" containerName="oc" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.951846 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.952828 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-8png4"] Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.953631 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-8png4" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.954096 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.954541 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zgmml" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.955026 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-c5gbj" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.971305 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-8png4"] Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.972671 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.974718 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2"] Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.977762 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-65ln8"] Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.978501 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvdzg\" (UniqueName: \"kubernetes.io/projected/7b482c75-7f98-46ad-8ad5-ff3df46f8965-kube-api-access-xvdzg\") pod \"cert-manager-cainjector-cf98fcc89-2p4r2\" 
(UID: \"7b482c75-7f98-46ad-8ad5-ff3df46f8965\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.978567 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.978586 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vskdv\" (UniqueName: \"kubernetes.io/projected/5a8813f8-fe38-4cb2-a737-ac9e4abce6a9-kube-api-access-vskdv\") pod \"cert-manager-858654f9db-8png4\" (UID: \"5a8813f8-fe38-4cb2-a737-ac9e4abce6a9\") " pod="cert-manager/cert-manager-858654f9db-8png4" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.980134 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-sxbpg" Mar 10 15:18:40 crc kubenswrapper[4743]: I0310 15:18:40.988403 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-65ln8"] Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.080502 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvdzg\" (UniqueName: \"kubernetes.io/projected/7b482c75-7f98-46ad-8ad5-ff3df46f8965-kube-api-access-xvdzg\") pod \"cert-manager-cainjector-cf98fcc89-2p4r2\" (UID: \"7b482c75-7f98-46ad-8ad5-ff3df46f8965\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.080603 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7khc\" (UniqueName: \"kubernetes.io/projected/5d7842a5-2d26-49cc-b4bf-c5afec234f08-kube-api-access-r7khc\") pod \"cert-manager-webhook-687f57d79b-65ln8\" (UID: \"5d7842a5-2d26-49cc-b4bf-c5afec234f08\") " pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" Mar 10 15:18:41 crc 
kubenswrapper[4743]: I0310 15:18:41.080647 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vskdv\" (UniqueName: \"kubernetes.io/projected/5a8813f8-fe38-4cb2-a737-ac9e4abce6a9-kube-api-access-vskdv\") pod \"cert-manager-858654f9db-8png4\" (UID: \"5a8813f8-fe38-4cb2-a737-ac9e4abce6a9\") " pod="cert-manager/cert-manager-858654f9db-8png4" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.102272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vskdv\" (UniqueName: \"kubernetes.io/projected/5a8813f8-fe38-4cb2-a737-ac9e4abce6a9-kube-api-access-vskdv\") pod \"cert-manager-858654f9db-8png4\" (UID: \"5a8813f8-fe38-4cb2-a737-ac9e4abce6a9\") " pod="cert-manager/cert-manager-858654f9db-8png4" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.102328 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvdzg\" (UniqueName: \"kubernetes.io/projected/7b482c75-7f98-46ad-8ad5-ff3df46f8965-kube-api-access-xvdzg\") pod \"cert-manager-cainjector-cf98fcc89-2p4r2\" (UID: \"7b482c75-7f98-46ad-8ad5-ff3df46f8965\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.182196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7khc\" (UniqueName: \"kubernetes.io/projected/5d7842a5-2d26-49cc-b4bf-c5afec234f08-kube-api-access-r7khc\") pod \"cert-manager-webhook-687f57d79b-65ln8\" (UID: \"5d7842a5-2d26-49cc-b4bf-c5afec234f08\") " pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.203903 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7khc\" (UniqueName: \"kubernetes.io/projected/5d7842a5-2d26-49cc-b4bf-c5afec234f08-kube-api-access-r7khc\") pod \"cert-manager-webhook-687f57d79b-65ln8\" (UID: \"5d7842a5-2d26-49cc-b4bf-c5afec234f08\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.269287 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.279003 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-8png4" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.291205 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.589239 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-65ln8"] Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.729905 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-8png4"] Mar 10 15:18:41 crc kubenswrapper[4743]: I0310 15:18:41.739716 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2"] Mar 10 15:18:41 crc kubenswrapper[4743]: W0310 15:18:41.740177 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a8813f8_fe38_4cb2_a737_ac9e4abce6a9.slice/crio-480849fc72870ca0fa199c491c2fb7e504f8ba54bed34f1a482c2733db84c865 WatchSource:0}: Error finding container 480849fc72870ca0fa199c491c2fb7e504f8ba54bed34f1a482c2733db84c865: Status 404 returned error can't find the container with id 480849fc72870ca0fa199c491c2fb7e504f8ba54bed34f1a482c2733db84c865 Mar 10 15:18:41 crc kubenswrapper[4743]: W0310 15:18:41.749451 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b482c75_7f98_46ad_8ad5_ff3df46f8965.slice/crio-8cde36848457161a2d79b542abfd7536a21877cfd0ab168d397121093bcb9fc5 WatchSource:0}: Error finding container 8cde36848457161a2d79b542abfd7536a21877cfd0ab168d397121093bcb9fc5: Status 404 returned error can't find the container with id 8cde36848457161a2d79b542abfd7536a21877cfd0ab168d397121093bcb9fc5 Mar 10 15:18:42 crc kubenswrapper[4743]: I0310 15:18:42.140928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-8png4" event={"ID":"5a8813f8-fe38-4cb2-a737-ac9e4abce6a9","Type":"ContainerStarted","Data":"480849fc72870ca0fa199c491c2fb7e504f8ba54bed34f1a482c2733db84c865"} Mar 10 15:18:42 crc kubenswrapper[4743]: I0310 15:18:42.142569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" event={"ID":"5d7842a5-2d26-49cc-b4bf-c5afec234f08","Type":"ContainerStarted","Data":"dc9976d503fbd00d3a60450bedef2b7d5e9af2cd64358bb7db693a0631575b04"} Mar 10 15:18:42 crc kubenswrapper[4743]: I0310 15:18:42.144065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2" event={"ID":"7b482c75-7f98-46ad-8ad5-ff3df46f8965","Type":"ContainerStarted","Data":"8cde36848457161a2d79b542abfd7536a21877cfd0ab168d397121093bcb9fc5"} Mar 10 15:18:44 crc kubenswrapper[4743]: I0310 15:18:44.162606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" event={"ID":"5d7842a5-2d26-49cc-b4bf-c5afec234f08","Type":"ContainerStarted","Data":"c541418c300c1b846cd9a20ab1fe8d09f979838655d0a3b55284560ccfde29ac"} Mar 10 15:18:44 crc kubenswrapper[4743]: I0310 15:18:44.163610 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" Mar 10 15:18:44 crc kubenswrapper[4743]: I0310 15:18:44.187505 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" podStartSLOduration=1.856626163 podStartE2EDuration="4.187472284s" podCreationTimestamp="2026-03-10 15:18:40 +0000 UTC" firstStartedPulling="2026-03-10 15:18:41.599261163 +0000 UTC m=+786.306075911" lastFinishedPulling="2026-03-10 15:18:43.930107284 +0000 UTC m=+788.636922032" observedRunningTime="2026-03-10 15:18:44.178927981 +0000 UTC m=+788.885742729" watchObservedRunningTime="2026-03-10 15:18:44.187472284 +0000 UTC m=+788.894287042" Mar 10 15:18:46 crc kubenswrapper[4743]: I0310 15:18:46.172833 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2" event={"ID":"7b482c75-7f98-46ad-8ad5-ff3df46f8965","Type":"ContainerStarted","Data":"1abafc9a5d1bc95bccd311a402cbadeb70017dd0a9ac24f3a6d556e7f47eccb9"} Mar 10 15:18:46 crc kubenswrapper[4743]: I0310 15:18:46.176088 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-8png4" event={"ID":"5a8813f8-fe38-4cb2-a737-ac9e4abce6a9","Type":"ContainerStarted","Data":"07a8c8210e9d106190f0495bc23c1eb1b5582590d6bcead5af976bb7e3685347"} Mar 10 15:18:46 crc kubenswrapper[4743]: I0310 15:18:46.193929 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2p4r2" podStartSLOduration=2.6098761059999998 podStartE2EDuration="6.193905888s" podCreationTimestamp="2026-03-10 15:18:40 +0000 UTC" firstStartedPulling="2026-03-10 15:18:41.752379318 +0000 UTC m=+786.459194066" lastFinishedPulling="2026-03-10 15:18:45.33640908 +0000 UTC m=+790.043223848" observedRunningTime="2026-03-10 15:18:46.19150674 +0000 UTC m=+790.898321498" watchObservedRunningTime="2026-03-10 15:18:46.193905888 +0000 UTC m=+790.900720636" Mar 10 15:18:46 crc kubenswrapper[4743]: I0310 15:18:46.224424 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-858654f9db-8png4" podStartSLOduration=2.574415938 podStartE2EDuration="6.224396505s" podCreationTimestamp="2026-03-10 15:18:40 +0000 UTC" firstStartedPulling="2026-03-10 15:18:41.745118432 +0000 UTC m=+786.451933170" lastFinishedPulling="2026-03-10 15:18:45.395098989 +0000 UTC m=+790.101913737" observedRunningTime="2026-03-10 15:18:46.220378871 +0000 UTC m=+790.927193629" watchObservedRunningTime="2026-03-10 15:18:46.224396505 +0000 UTC m=+790.931211253" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.017946 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dxdms"] Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.019019 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovn-controller" containerID="cri-o://98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71" gracePeriod=30 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.019192 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="sbdb" containerID="cri-o://ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163" gracePeriod=30 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.019269 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8" gracePeriod=30 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.019317 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" 
containerName="northd" containerID="cri-o://d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d" gracePeriod=30 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.019370 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kube-rbac-proxy-node" containerID="cri-o://9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8" gracePeriod=30 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.019397 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovn-acl-logging" containerID="cri-o://06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb" gracePeriod=30 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.021288 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="nbdb" containerID="cri-o://f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea" gracePeriod=30 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.067659 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" containerID="cri-o://f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95" gracePeriod=30 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.212262 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/2.log" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.212964 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/1.log" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.213064 4743 generic.go:334] "Generic (PLEG): container finished" podID="1736aae6-d840-4b31-8c44-6637a05f37ef" containerID="c1aa041b4cfddd3a861e32679ddaa7411c71c86111d77caf2db634f7bcfbc8bb" exitCode=2 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.213181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgbfn" event={"ID":"1736aae6-d840-4b31-8c44-6637a05f37ef","Type":"ContainerDied","Data":"c1aa041b4cfddd3a861e32679ddaa7411c71c86111d77caf2db634f7bcfbc8bb"} Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.213291 4743 scope.go:117] "RemoveContainer" containerID="88086b6c28a891aaa1fbbf913ac7fa1682344f061c747519eccabb7cd99770fe" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.213978 4743 scope.go:117] "RemoveContainer" containerID="c1aa041b4cfddd3a861e32679ddaa7411c71c86111d77caf2db634f7bcfbc8bb" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.214229 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vgbfn_openshift-multus(1736aae6-d840-4b31-8c44-6637a05f37ef)\"" pod="openshift-multus/multus-vgbfn" podUID="1736aae6-d840-4b31-8c44-6637a05f37ef" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.221493 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovnkube-controller/3.log" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.225494 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovn-acl-logging/0.log" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226179 4743 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovn-controller/0.log" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226567 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95" exitCode=0 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"} Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226750 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"} Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226705 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8" exitCode=0 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226784 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8" exitCode=0 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226845 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb" exitCode=143 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226853 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" 
containerID="98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71" exitCode=143 Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"} Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"} Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.226900 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"} Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.263302 4743 scope.go:117] "RemoveContainer" containerID="7c51084b610b45929b32ddf2306fdb2541af8fc56c0030714457567e46be805d" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.295083 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-65ln8" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.369307 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovn-acl-logging/0.log" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.370559 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovn-controller/0.log" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.371259 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.444725 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hm9mc"] Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445507 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445531 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445548 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="sbdb" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445556 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="sbdb" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445567 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445575 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445591 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445598 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445611 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="nbdb" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445619 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="nbdb" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445628 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="northd" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445638 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="northd" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445652 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445662 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445670 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kube-rbac-proxy-node" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445676 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kube-rbac-proxy-node" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445699 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovn-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445705 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovn-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445714 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kubecfg-setup" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445721 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kubecfg-setup" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.445741 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovn-acl-logging" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.445747 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovn-acl-logging" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446046 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovn-acl-logging" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446071 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446082 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovn-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446099 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="kube-rbac-proxy-node" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446123 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="nbdb" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446131 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446140 4743 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="northd" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446155 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446166 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446182 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446199 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446217 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="sbdb" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.446573 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446588 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: E0310 15:18:51.446607 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.446616 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ad6254-92fa-4092-8b86-2393f317f163" containerName="ovnkube-controller" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.452443 4743 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.453412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-systemd\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.453575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-etc-openvswitch\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.453892 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-script-lib\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.454006 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.454165 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-node-log\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.454272 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-ovn\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.454386 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-node-log" (OuterVolumeSpecName: "node-log") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.454381 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455185 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-config\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455428 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-slash\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-slash" (OuterVolumeSpecName: "host-slash") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455618 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455633 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). 
InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-openvswitch\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455898 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-var-lib-cni-networks-ovn-kubernetes\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455942 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-ovn-kubernetes\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456087 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-netd\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456189 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-env-overrides\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.455932 
4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456211 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-netns\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456519 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-bin\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-systemd-units\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456576 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnddq\" (UniqueName: \"kubernetes.io/projected/91ad6254-92fa-4092-8b86-2393f317f163-kube-api-access-xnddq\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456639 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-var-lib-openvswitch\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456691 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91ad6254-92fa-4092-8b86-2393f317f163-ovn-node-metrics-cert\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-log-socket\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456741 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-kubelet\") pod \"91ad6254-92fa-4092-8b86-2393f317f163\" (UID: \"91ad6254-92fa-4092-8b86-2393f317f163\") " Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.456827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457459 4743 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457487 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457500 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457512 4743 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457530 4743 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457541 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457552 4743 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 
15:18:51.457568 4743 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457579 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91ad6254-92fa-4092-8b86-2393f317f163-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457592 4743 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457606 4743 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457624 4743 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-log-socket" (OuterVolumeSpecName: "log-socket") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457648 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457683 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457687 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.457701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.464657 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ad6254-92fa-4092-8b86-2393f317f163-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.464643 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91ad6254-92fa-4092-8b86-2393f317f163-kube-api-access-xnddq" (OuterVolumeSpecName: "kube-api-access-xnddq") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "kube-api-access-xnddq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.480091 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "91ad6254-92fa-4092-8b86-2393f317f163" (UID: "91ad6254-92fa-4092-8b86-2393f317f163"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.558734 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.558804 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovnkube-script-lib\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.558855 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-var-lib-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.558882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.558909 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-ovn\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.558926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-cni-netd\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-run-netns\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559022 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-node-log\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovn-node-metrics-cert\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559078 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-systemd-units\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-kubelet\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-cni-bin\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-slash\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559158 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-tbgk9\" (UniqueName: \"kubernetes.io/projected/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-kube-api-access-tbgk9\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559208 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-env-overrides\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovnkube-config\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-systemd\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559273 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-log-socket\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559291 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-etc-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559394 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559462 4743 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559488 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnddq\" (UniqueName: \"kubernetes.io/projected/91ad6254-92fa-4092-8b86-2393f317f163-kube-api-access-xnddq\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559510 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559530 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91ad6254-92fa-4092-8b86-2393f317f163-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559549 4743 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 
15:18:51.559566 4743 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-log-socket\") on node \"crc\" DevicePath \"\""
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.559585 4743 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91ad6254-92fa-4092-8b86-2393f317f163-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovnkube-script-lib\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660611 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-var-lib-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-cni-netd\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-ovn\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-run-netns\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660747 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-node-log\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovn-node-metrics-cert\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-systemd-units\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660840 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-kubelet\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660863 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-cni-bin\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-slash\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbgk9\" (UniqueName: \"kubernetes.io/projected/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-kube-api-access-tbgk9\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660916 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-env-overrides\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovnkube-config\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-log-socket\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.660986 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-systemd\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.661002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-etc-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.661026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.661126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662015 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-systemd-units\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662038 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-ovn\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662089 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-slash\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-cni-bin\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662063 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-kubelet\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662159 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-run-netns\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-node-log\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovnkube-script-lib\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-cni-netd\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662152 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-var-lib-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662318 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-log-socket\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-run-systemd\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.662571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-etc-openvswitch\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.663118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-env-overrides\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.663450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovnkube-config\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.666510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-ovn-node-metrics-cert\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.681006 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbgk9\" (UniqueName: \"kubernetes.io/projected/ba0802b4-1a3c-4b48-a0bf-d360c6b45e88-kube-api-access-tbgk9\") pod \"ovnkube-node-hm9mc\" (UID: \"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:51 crc kubenswrapper[4743]: I0310 15:18:51.771844 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.235072 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/2.log"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.237091 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba0802b4-1a3c-4b48-a0bf-d360c6b45e88" containerID="0c26df53fe81bfc20bc9e70af4c09f57b483745cecbdf2ce5358be5beca3e4fd" exitCode=0
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.237159 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerDied","Data":"0c26df53fe81bfc20bc9e70af4c09f57b483745cecbdf2ce5358be5beca3e4fd"}
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.237630 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"a7de3ae9dfe3f32906d94cff7f6cbb070153897ada9b60dad87521caa37bc262"}
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.246910 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovn-acl-logging/0.log"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.247489 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dxdms_91ad6254-92fa-4092-8b86-2393f317f163/ovn-controller/0.log"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249379 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163" exitCode=0
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249418 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea" exitCode=0
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249431 4743 generic.go:334] "Generic (PLEG): container finished" podID="91ad6254-92fa-4092-8b86-2393f317f163" containerID="d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d" exitCode=0
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249444 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"}
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249497 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"}
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249515 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"}
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms" event={"ID":"91ad6254-92fa-4092-8b86-2393f317f163","Type":"ContainerDied","Data":"1b3547cbebac76cbd1b7a034ce35c266fc738100f85d153206f34093fae20903"}
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249536 4743 scope.go:117] "RemoveContainer" containerID="f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.249785 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dxdms"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.267229 4743 scope.go:117] "RemoveContainer" containerID="ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.299281 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dxdms"]
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.303729 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dxdms"]
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.314918 4743 scope.go:117] "RemoveContainer" containerID="f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.332467 4743 scope.go:117] "RemoveContainer" containerID="d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.346624 4743 scope.go:117] "RemoveContainer" containerID="124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.365311 4743 scope.go:117] "RemoveContainer" containerID="9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.380008 4743 scope.go:117] "RemoveContainer" containerID="06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.397485 4743 scope.go:117] "RemoveContainer" containerID="98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.421250 4743 scope.go:117] "RemoveContainer" containerID="76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.459534 4743 scope.go:117] "RemoveContainer" containerID="f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.460131 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95\": container with ID starting with f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95 not found: ID does not exist" containerID="f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.460163 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"} err="failed to get container status \"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95\": rpc error: code = NotFound desc = could not find container \"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95\": container with ID starting with f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.460187 4743 scope.go:117] "RemoveContainer" containerID="ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.460596 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\": container with ID starting with ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163 not found: ID does not exist" containerID="ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.460667 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"} err="failed to get container status \"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\": rpc error: code = NotFound desc = could not find container \"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\": container with ID starting with ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.460721 4743 scope.go:117] "RemoveContainer" containerID="f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.461185 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\": container with ID starting with f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea not found: ID does not exist" containerID="f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.461233 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"} err="failed to get container status \"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\": rpc error: code = NotFound desc = could not find container \"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\": container with ID starting with f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.461263 4743 scope.go:117] "RemoveContainer" containerID="d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.461560 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\": container with ID starting with d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d not found: ID does not exist" containerID="d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.461593 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"} err="failed to get container status \"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\": rpc error: code = NotFound desc = could not find container \"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\": container with ID starting with d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.461614 4743 scope.go:117] "RemoveContainer" containerID="124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.461939 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\": container with ID starting with 124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8 not found: ID does not exist" containerID="124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.461985 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"} err="failed to get container status \"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\": rpc error: code = NotFound desc = could not find container \"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\": container with ID starting with 124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.462010 4743 scope.go:117] "RemoveContainer" containerID="9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.462619 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\": container with ID starting with 9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8 not found: ID does not exist" containerID="9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.462655 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"} err="failed to get container status \"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\": rpc error: code = NotFound desc = could not find container \"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\": container with ID starting with 9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.462674 4743 scope.go:117] "RemoveContainer" containerID="06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.463029 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\": container with ID starting with 06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb not found: ID does not exist" containerID="06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.463061 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"} err="failed to get container status \"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\": rpc error: code = NotFound desc = could not find container \"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\": container with ID starting with 06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.463080 4743 scope.go:117] "RemoveContainer" containerID="98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.463728 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\": container with ID starting with 98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71 not found: ID does not exist" containerID="98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.463777 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"} err="failed to get container status \"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\": rpc error: code = NotFound desc = could not find container \"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\": container with ID starting with 98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.463804 4743 scope.go:117] "RemoveContainer" containerID="76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8"
Mar 10 15:18:52 crc kubenswrapper[4743]: E0310 15:18:52.464224 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\": container with ID starting with 76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8 not found: ID does not exist" containerID="76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.464258 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8"} err="failed to get container status \"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\": rpc error: code = NotFound desc = could not find container \"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\": container with ID starting with 76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.464277 4743 scope.go:117] "RemoveContainer" containerID="f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.464600 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"} err="failed to get container status \"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95\": rpc error: code = NotFound desc = could not find container \"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95\": container with ID starting with f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.464626 4743 scope.go:117] "RemoveContainer" containerID="ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.464999 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"} err="failed to get container status \"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\": rpc error: code = NotFound desc = could not find container \"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\": container with ID starting with ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.465029 4743 scope.go:117] "RemoveContainer" containerID="f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.465314 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"} err="failed to get container status \"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\": rpc error: code = NotFound desc = could not find container \"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\": container with ID starting with f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.465343 4743 scope.go:117] "RemoveContainer" containerID="d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.465637 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"} err="failed to get container status \"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\": rpc error: code = NotFound desc = could not find container \"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\": container with ID starting with d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.465662 4743 scope.go:117] "RemoveContainer" containerID="124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.465972 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"} err="failed to get container status \"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\": rpc error: code = NotFound desc = could not find container \"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\": container with ID starting with 124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.465994 4743 scope.go:117] "RemoveContainer" containerID="9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.466265 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"} err="failed to get container status \"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\": rpc error: code = NotFound desc = could not find container \"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\": container with ID starting with 9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.466286 4743 scope.go:117] "RemoveContainer" containerID="06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.466752 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"} err="failed to get container status \"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\": rpc error: code = NotFound desc = could not find container \"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\": container with ID starting with 06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.466783 4743 scope.go:117] "RemoveContainer" containerID="98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.467090 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"} err="failed to get container status \"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\": rpc error: code = NotFound desc = could not find container \"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\": container with ID starting with 98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71 not found: ID does not exist"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.467115 4743 scope.go:117] "RemoveContainer" containerID="76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8"
Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.467406 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8"} err="failed to get container status \"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\": rpc error: code = NotFound desc = could not find container \"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\": container
with ID starting with 76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8 not found: ID does not exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.467428 4743 scope.go:117] "RemoveContainer" containerID="f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.467968 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95"} err="failed to get container status \"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95\": rpc error: code = NotFound desc = could not find container \"f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95\": container with ID starting with f47d5a4f85448aef3f9df15e91f43a74205c6d5de76921b8486fa668a9359a95 not found: ID does not exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.468009 4743 scope.go:117] "RemoveContainer" containerID="ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.468548 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163"} err="failed to get container status \"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\": rpc error: code = NotFound desc = could not find container \"ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163\": container with ID starting with ea42298fa02c503743c45767f07a4ab4cd657c1d68398d5249b8253e71873163 not found: ID does not exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.468572 4743 scope.go:117] "RemoveContainer" containerID="f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.468941 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea"} err="failed to get container status \"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\": rpc error: code = NotFound desc = could not find container \"f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea\": container with ID starting with f5b6900b999df2476b369a9810be22c9c810bfc86c5b1d1445447cba88e0abea not found: ID does not exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.468962 4743 scope.go:117] "RemoveContainer" containerID="d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.469294 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d"} err="failed to get container status \"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\": rpc error: code = NotFound desc = could not find container \"d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d\": container with ID starting with d59a4cd361ebb2d121e3917cf8e2d5f5eeafbaf47be1a7854c0c5b759920ff3d not found: ID does not exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.469313 4743 scope.go:117] "RemoveContainer" containerID="124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.469572 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8"} err="failed to get container status \"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\": rpc error: code = NotFound desc = could not find container \"124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8\": container with ID starting with 124c6ac38889239b8cef414750bd5cd2c3f4f9d3b889092dda72328cbf4b31f8 not found: ID does not 
exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.469590 4743 scope.go:117] "RemoveContainer" containerID="9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.470146 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8"} err="failed to get container status \"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\": rpc error: code = NotFound desc = could not find container \"9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8\": container with ID starting with 9a0f2d15fc49529b1acc2c4f684c6ae7e31290468d519ad2ec1456b092b494c8 not found: ID does not exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.470198 4743 scope.go:117] "RemoveContainer" containerID="06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.470688 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb"} err="failed to get container status \"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\": rpc error: code = NotFound desc = could not find container \"06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb\": container with ID starting with 06e4739faa428b8a49969b1873d6448b832a2e37a5761bcd85a2f1e2d689befb not found: ID does not exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.470712 4743 scope.go:117] "RemoveContainer" containerID="98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.471327 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71"} err="failed to get container status 
\"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\": rpc error: code = NotFound desc = could not find container \"98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71\": container with ID starting with 98bbbf7f7c6e2eeaf660ae2f5753b6860eec7a4f89648f6c0234beb3d0de7e71 not found: ID does not exist" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.471367 4743 scope.go:117] "RemoveContainer" containerID="76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8" Mar 10 15:18:52 crc kubenswrapper[4743]: I0310 15:18:52.471747 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8"} err="failed to get container status \"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\": rpc error: code = NotFound desc = could not find container \"76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8\": container with ID starting with 76274b97a4d943e944f1b540984ee0ea829891fda303c2bc73adfb935baa1be8 not found: ID does not exist" Mar 10 15:18:53 crc kubenswrapper[4743]: I0310 15:18:53.262286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"e4a3c7c90e796d1a7de60e33a761ce26c2fbaae60d02c739072be49ca8055f8e"} Mar 10 15:18:53 crc kubenswrapper[4743]: I0310 15:18:53.262345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"b804f41911d289b68e38c8845282ffcdc5517d248db1cfc62ae509ebba3a9b90"} Mar 10 15:18:53 crc kubenswrapper[4743]: I0310 15:18:53.262359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" 
event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"89a8628f377e146959a9fad4c881caf1330ab2b10ec424fdfda76dc01824f55b"} Mar 10 15:18:53 crc kubenswrapper[4743]: I0310 15:18:53.262371 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"cb874568926cc3be4c8175c0d63dd6fac28c4bab71e4dcc21a42fdfa8619c797"} Mar 10 15:18:53 crc kubenswrapper[4743]: I0310 15:18:53.262386 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"6a6ff6fa50daf163b80fbceb6551c491e76f0c0823d73a0fc7a801cabeb2a15a"} Mar 10 15:18:53 crc kubenswrapper[4743]: I0310 15:18:53.262397 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"58cd99aa23a3f215ccda1834c3f57451ddc32482228c1e59dec2c22f67690bff"} Mar 10 15:18:53 crc kubenswrapper[4743]: I0310 15:18:53.935130 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91ad6254-92fa-4092-8b86-2393f317f163" path="/var/lib/kubelet/pods/91ad6254-92fa-4092-8b86-2393f317f163/volumes" Mar 10 15:18:55 crc kubenswrapper[4743]: I0310 15:18:55.282096 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"6cac92f0f40b4befd4ef7888591916b9870571d2ff23c6ea0f37b07690365eea"} Mar 10 15:18:58 crc kubenswrapper[4743]: I0310 15:18:58.316780 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" 
event={"ID":"ba0802b4-1a3c-4b48-a0bf-d360c6b45e88","Type":"ContainerStarted","Data":"6edd2bf8ab8e4ac22257ea1d3f60f8ce989477a0a10b1ed260c0acbaef6b06a6"} Mar 10 15:18:58 crc kubenswrapper[4743]: I0310 15:18:58.317914 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:58 crc kubenswrapper[4743]: I0310 15:18:58.317945 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:58 crc kubenswrapper[4743]: I0310 15:18:58.317953 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:58 crc kubenswrapper[4743]: I0310 15:18:58.344619 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:18:58 crc kubenswrapper[4743]: I0310 15:18:58.346240 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" podStartSLOduration=7.346225015 podStartE2EDuration="7.346225015s" podCreationTimestamp="2026-03-10 15:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:18:58.345073882 +0000 UTC m=+803.051888640" watchObservedRunningTime="2026-03-10 15:18:58.346225015 +0000 UTC m=+803.053039763" Mar 10 15:18:58 crc kubenswrapper[4743]: I0310 15:18:58.360109 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:19:05 crc kubenswrapper[4743]: I0310 15:19:05.922319 4743 scope.go:117] "RemoveContainer" containerID="c1aa041b4cfddd3a861e32679ddaa7411c71c86111d77caf2db634f7bcfbc8bb" Mar 10 15:19:05 crc kubenswrapper[4743]: E0310 15:19:05.923170 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vgbfn_openshift-multus(1736aae6-d840-4b31-8c44-6637a05f37ef)\"" pod="openshift-multus/multus-vgbfn" podUID="1736aae6-d840-4b31-8c44-6637a05f37ef" Mar 10 15:19:11 crc kubenswrapper[4743]: I0310 15:19:11.253409 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:19:11 crc kubenswrapper[4743]: I0310 15:19:11.254071 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:19:11 crc kubenswrapper[4743]: I0310 15:19:11.964847 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Mar 10 15:19:11 crc kubenswrapper[4743]: I0310 15:19:11.966360 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph" Mar 10 15:19:11 crc kubenswrapper[4743]: I0310 15:19:11.968863 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 15:19:11 crc kubenswrapper[4743]: I0310 15:19:11.970019 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nzxgm" Mar 10 15:19:11 crc kubenswrapper[4743]: I0310 15:19:11.970359 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.065696 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-log\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.065804 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-data\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.065869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkh8f\" (UniqueName: \"kubernetes.io/projected/203a508a-4041-40ea-9a1b-bb6a706d9339-kube-api-access-qkh8f\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.066117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-run\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 
15:19:12.167737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-data\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.167833 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkh8f\" (UniqueName: \"kubernetes.io/projected/203a508a-4041-40ea-9a1b-bb6a706d9339-kube-api-access-qkh8f\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.167879 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-run\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.168011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-log\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.168461 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-data\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.168547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-log\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.168779 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/203a508a-4041-40ea-9a1b-bb6a706d9339-run\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.204930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkh8f\" (UniqueName: \"kubernetes.io/projected/203a508a-4041-40ea-9a1b-bb6a706d9339-kube-api-access-qkh8f\") pod \"ceph\" (UID: \"203a508a-4041-40ea-9a1b-bb6a706d9339\") " pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.288321 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Mar 10 15:19:12 crc kubenswrapper[4743]: W0310 15:19:12.312510 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod203a508a_4041_40ea_9a1b_bb6a706d9339.slice/crio-274d819629a190c540dfd68ef55c123105275d48c77e16602f91e01b567b12a7 WatchSource:0}: Error finding container 274d819629a190c540dfd68ef55c123105275d48c77e16602f91e01b567b12a7: Status 404 returned error can't find the container with id 274d819629a190c540dfd68ef55c123105275d48c77e16602f91e01b567b12a7 Mar 10 15:19:12 crc kubenswrapper[4743]: E0310 15:19:12.332041 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:12 crc kubenswrapper[4743]: E0310 15:19:12.348273 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:12 crc kubenswrapper[4743]: I0310 15:19:12.412576 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceph" event={"ID":"203a508a-4041-40ea-9a1b-bb6a706d9339","Type":"ContainerStarted","Data":"274d819629a190c540dfd68ef55c123105275d48c77e16602f91e01b567b12a7"} Mar 10 15:19:13 crc kubenswrapper[4743]: E0310 15:19:13.464895 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:13 crc kubenswrapper[4743]: E0310 15:19:13.480054 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:14 crc kubenswrapper[4743]: E0310 15:19:14.624474 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:14 crc kubenswrapper[4743]: E0310 15:19:14.644501 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:15 crc kubenswrapper[4743]: E0310 15:19:15.834481 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:15 crc kubenswrapper[4743]: E0310 15:19:15.848674 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, 
AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:17 crc kubenswrapper[4743]: E0310 15:19:17.010738 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:17 crc kubenswrapper[4743]: E0310 15:19:17.026406 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:17 crc kubenswrapper[4743]: I0310 15:19:17.915975 4743 scope.go:117] "RemoveContainer" containerID="c1aa041b4cfddd3a861e32679ddaa7411c71c86111d77caf2db634f7bcfbc8bb" Mar 10 15:19:18 crc kubenswrapper[4743]: E0310 15:19:18.166998 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:18 crc kubenswrapper[4743]: E0310 15:19:18.185434 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:18 crc kubenswrapper[4743]: I0310 15:19:18.447652 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgbfn_1736aae6-d840-4b31-8c44-6637a05f37ef/kube-multus/2.log" Mar 10 15:19:18 crc kubenswrapper[4743]: I0310 15:19:18.447713 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgbfn" 
event={"ID":"1736aae6-d840-4b31-8c44-6637a05f37ef","Type":"ContainerStarted","Data":"44717ee3f5cf6a32518cd0d60ee604baa3d93e0555a67ae7ae68ed6b457f7980"} Mar 10 15:19:19 crc kubenswrapper[4743]: E0310 15:19:19.348051 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:19 crc kubenswrapper[4743]: E0310 15:19:19.362290 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:20 crc kubenswrapper[4743]: E0310 15:19:20.532458 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:20 crc kubenswrapper[4743]: E0310 15:19:20.547896 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:21 crc kubenswrapper[4743]: E0310 15:19:21.675143 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:21 crc kubenswrapper[4743]: E0310 15:19:21.689663 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: 
certificate signed by unknown authority" Mar 10 15:19:21 crc kubenswrapper[4743]: I0310 15:19:21.795293 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm9mc" Mar 10 15:19:22 crc kubenswrapper[4743]: E0310 15:19:22.829517 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:22 crc kubenswrapper[4743]: E0310 15:19:22.848525 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:23 crc kubenswrapper[4743]: E0310 15:19:23.979369 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:23 crc kubenswrapper[4743]: E0310 15:19:23.995735 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:25 crc kubenswrapper[4743]: E0310 15:19:25.124911 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:25 crc kubenswrapper[4743]: E0310 15:19:25.142924 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, 
AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:26 crc kubenswrapper[4743]: E0310 15:19:26.269387 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:26 crc kubenswrapper[4743]: E0310 15:19:26.284520 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:27 crc kubenswrapper[4743]: E0310 15:19:27.425582 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:27 crc kubenswrapper[4743]: E0310 15:19:27.439650 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:28 crc kubenswrapper[4743]: E0310 15:19:28.599109 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:28 crc kubenswrapper[4743]: E0310 15:19:28.614932 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown 
authority" Mar 10 15:19:29 crc kubenswrapper[4743]: E0310 15:19:29.752884 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:29 crc kubenswrapper[4743]: E0310 15:19:29.773443 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:30 crc kubenswrapper[4743]: E0310 15:19:30.935271 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:30 crc kubenswrapper[4743]: E0310 15:19:30.952450 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:31 crc kubenswrapper[4743]: E0310 15:19:31.233373 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/ceph/demo:latest-squid" Mar 10 15:19:31 crc kubenswrapper[4743]: E0310 15:19:31.233564 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceph,Image:quay.io/ceph/demo:latest-squid,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:MON_IP,Value:192.168.126.11,ValueFrom:nil,},EnvVar{Name:CEPH_DAEMON,Value:demo,ValueFrom:nil,},EnvVar{Name:CEPH_PUBLIC_NETWORK,Value:0.0.0.0/0,ValueFrom:nil,},EnvVar{Name:DEMO_DAEMONS,Value:osd,mds,rgw,ValueFrom:nil,},EnvVar{Name:CEPH_DEMO_UID,Value:0,ValueFrom:nil,},EnvVar{Name:RGW_NAME,Value:ceph,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkh8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceph_openstack(203a508a-4041-40ea-9a1b-bb6a706d9339): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:19:31 crc kubenswrapper[4743]: E0310 15:19:31.234794 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceph" podUID="203a508a-4041-40ea-9a1b-bb6a706d9339" Mar 10 
15:19:31 crc kubenswrapper[4743]: E0310 15:19:31.523099 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/ceph/demo:latest-squid\\\"\"" pod="openstack/ceph" podUID="203a508a-4041-40ea-9a1b-bb6a706d9339" Mar 10 15:19:32 crc kubenswrapper[4743]: E0310 15:19:32.127388 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:32 crc kubenswrapper[4743]: E0310 15:19:32.148332 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:33 crc kubenswrapper[4743]: E0310 15:19:33.357371 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:33 crc kubenswrapper[4743]: E0310 15:19:33.411721 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:34 crc kubenswrapper[4743]: E0310 15:19:34.559374 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:34 crc kubenswrapper[4743]: E0310 15:19:34.578805 4743 server.go:309] "Unable to authenticate the 
request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:35 crc kubenswrapper[4743]: E0310 15:19:35.764205 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:35 crc kubenswrapper[4743]: E0310 15:19:35.786692 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:36 crc kubenswrapper[4743]: E0310 15:19:36.926939 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:36 crc kubenswrapper[4743]: E0310 15:19:36.944191 4743 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=2081946086216162561, SKID=, AKID=81:58:E6:6C:5C:1A:F9:2A:69:36:BC:10:49:63:42:60:5E:06:F5:3C failed: x509: certificate signed by unknown authority" Mar 10 15:19:37 crc kubenswrapper[4743]: I0310 15:19:37.091726 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 15:19:41 crc kubenswrapper[4743]: I0310 15:19:41.252673 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 10 15:19:41 crc kubenswrapper[4743]: I0310 15:19:41.252905 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:19:46 crc kubenswrapper[4743]: I0310 15:19:46.622292 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"203a508a-4041-40ea-9a1b-bb6a706d9339","Type":"ContainerStarted","Data":"0f4d3bf792a23826e66d344824a80a54ff052434f10831fe9158e2720c583691"} Mar 10 15:19:46 crc kubenswrapper[4743]: I0310 15:19:46.646626 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=2.455259928 podStartE2EDuration="35.646601827s" podCreationTimestamp="2026-03-10 15:19:11 +0000 UTC" firstStartedPulling="2026-03-10 15:19:12.314573859 +0000 UTC m=+817.021388607" lastFinishedPulling="2026-03-10 15:19:45.505915748 +0000 UTC m=+850.212730506" observedRunningTime="2026-03-10 15:19:46.639296215 +0000 UTC m=+851.346110963" watchObservedRunningTime="2026-03-10 15:19:46.646601827 +0000 UTC m=+851.353416585" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.136559 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552600-lknsq"] Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.140059 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-lknsq" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.142511 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.145089 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-lknsq"] Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.145331 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.145432 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.233483 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27f6l\" (UniqueName: \"kubernetes.io/projected/45e27721-5b31-46bb-9514-0b3691820891-kube-api-access-27f6l\") pod \"auto-csr-approver-29552600-lknsq\" (UID: \"45e27721-5b31-46bb-9514-0b3691820891\") " pod="openshift-infra/auto-csr-approver-29552600-lknsq" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.335567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27f6l\" (UniqueName: \"kubernetes.io/projected/45e27721-5b31-46bb-9514-0b3691820891-kube-api-access-27f6l\") pod \"auto-csr-approver-29552600-lknsq\" (UID: \"45e27721-5b31-46bb-9514-0b3691820891\") " pod="openshift-infra/auto-csr-approver-29552600-lknsq" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.364024 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27f6l\" (UniqueName: \"kubernetes.io/projected/45e27721-5b31-46bb-9514-0b3691820891-kube-api-access-27f6l\") pod \"auto-csr-approver-29552600-lknsq\" (UID: \"45e27721-5b31-46bb-9514-0b3691820891\") " 
pod="openshift-infra/auto-csr-approver-29552600-lknsq" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.458911 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-lknsq" Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.666196 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-lknsq"] Mar 10 15:20:00 crc kubenswrapper[4743]: W0310 15:20:00.672273 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e27721_5b31_46bb_9514_0b3691820891.slice/crio-9d682ca1b2b7b3bb8e1f5cfc65ce232b5ecdca0e3f0989bf883a2c824c1edec8 WatchSource:0}: Error finding container 9d682ca1b2b7b3bb8e1f5cfc65ce232b5ecdca0e3f0989bf883a2c824c1edec8: Status 404 returned error can't find the container with id 9d682ca1b2b7b3bb8e1f5cfc65ce232b5ecdca0e3f0989bf883a2c824c1edec8 Mar 10 15:20:00 crc kubenswrapper[4743]: I0310 15:20:00.710951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-lknsq" event={"ID":"45e27721-5b31-46bb-9514-0b3691820891","Type":"ContainerStarted","Data":"9d682ca1b2b7b3bb8e1f5cfc65ce232b5ecdca0e3f0989bf883a2c824c1edec8"} Mar 10 15:20:02 crc kubenswrapper[4743]: I0310 15:20:02.725688 4743 generic.go:334] "Generic (PLEG): container finished" podID="45e27721-5b31-46bb-9514-0b3691820891" containerID="1a5e0b52b7d3edafb69cba70213fad247de19d0a18c942cbeeaef768a0c4c220" exitCode=0 Mar 10 15:20:02 crc kubenswrapper[4743]: I0310 15:20:02.725802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-lknsq" event={"ID":"45e27721-5b31-46bb-9514-0b3691820891","Type":"ContainerDied","Data":"1a5e0b52b7d3edafb69cba70213fad247de19d0a18c942cbeeaef768a0c4c220"} Mar 10 15:20:03 crc kubenswrapper[4743]: I0310 15:20:03.982899 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-lknsq" Mar 10 15:20:04 crc kubenswrapper[4743]: I0310 15:20:04.092246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27f6l\" (UniqueName: \"kubernetes.io/projected/45e27721-5b31-46bb-9514-0b3691820891-kube-api-access-27f6l\") pod \"45e27721-5b31-46bb-9514-0b3691820891\" (UID: \"45e27721-5b31-46bb-9514-0b3691820891\") " Mar 10 15:20:04 crc kubenswrapper[4743]: I0310 15:20:04.099294 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e27721-5b31-46bb-9514-0b3691820891-kube-api-access-27f6l" (OuterVolumeSpecName: "kube-api-access-27f6l") pod "45e27721-5b31-46bb-9514-0b3691820891" (UID: "45e27721-5b31-46bb-9514-0b3691820891"). InnerVolumeSpecName "kube-api-access-27f6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:20:04 crc kubenswrapper[4743]: I0310 15:20:04.194552 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27f6l\" (UniqueName: \"kubernetes.io/projected/45e27721-5b31-46bb-9514-0b3691820891-kube-api-access-27f6l\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:04 crc kubenswrapper[4743]: I0310 15:20:04.743381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-lknsq" event={"ID":"45e27721-5b31-46bb-9514-0b3691820891","Type":"ContainerDied","Data":"9d682ca1b2b7b3bb8e1f5cfc65ce232b5ecdca0e3f0989bf883a2c824c1edec8"} Mar 10 15:20:04 crc kubenswrapper[4743]: I0310 15:20:04.743449 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d682ca1b2b7b3bb8e1f5cfc65ce232b5ecdca0e3f0989bf883a2c824c1edec8" Mar 10 15:20:04 crc kubenswrapper[4743]: I0310 15:20:04.743450 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-lknsq" Mar 10 15:20:05 crc kubenswrapper[4743]: I0310 15:20:05.053470 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-g99n7"] Mar 10 15:20:05 crc kubenswrapper[4743]: I0310 15:20:05.058956 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-g99n7"] Mar 10 15:20:05 crc kubenswrapper[4743]: I0310 15:20:05.924140 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="665e9938-cfcb-45ff-8bb8-158fb0c7d2d9" path="/var/lib/kubelet/pods/665e9938-cfcb-45ff-8bb8-158fb0c7d2d9/volumes" Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.252726 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.253377 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.253444 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.255086 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97d590bdb55b8d3eeaf67e203c3704815de1593197109afcafe553653c1d6c9f"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.255180 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://97d590bdb55b8d3eeaf67e203c3704815de1593197109afcafe553653c1d6c9f" gracePeriod=600 Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.809064 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="97d590bdb55b8d3eeaf67e203c3704815de1593197109afcafe553653c1d6c9f" exitCode=0 Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.809166 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"97d590bdb55b8d3eeaf67e203c3704815de1593197109afcafe553653c1d6c9f"} Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.809504 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"9f3c2bb9de0715122300a23c69b01248c82e5f8f70b4cad7c24ef747b724b4d8"} Mar 10 15:20:11 crc kubenswrapper[4743]: I0310 15:20:11.809565 4743 scope.go:117] "RemoveContainer" containerID="485a71da308cafc31c75bf433fce757dfcdc428af1dc1ad1dec2295c6a5710ee" Mar 10 15:20:19 crc kubenswrapper[4743]: E0310 15:20:19.993010 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.115:38770->38.102.83.115:40715: write tcp 38.102.83.115:38770->38.102.83.115:40715: write: broken pipe Mar 10 15:20:36 crc kubenswrapper[4743]: I0310 15:20:36.509336 4743 scope.go:117] "RemoveContainer" 
containerID="781be795c7e4e4a724afed1359bbfca1a4b78fdb535b5330a50853c2c57638f4" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.635299 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk"] Mar 10 15:20:58 crc kubenswrapper[4743]: E0310 15:20:58.636426 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e27721-5b31-46bb-9514-0b3691820891" containerName="oc" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.636443 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e27721-5b31-46bb-9514-0b3691820891" containerName="oc" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.636536 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e27721-5b31-46bb-9514-0b3691820891" containerName="oc" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.637392 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.639664 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.649362 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk"] Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.799577 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 
15:20:58.799669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.800102 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mcl\" (UniqueName: \"kubernetes.io/projected/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-kube-api-access-p7mcl\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.902494 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mcl\" (UniqueName: \"kubernetes.io/projected/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-kube-api-access-p7mcl\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.902623 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.902671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.903736 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.903778 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.938424 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mcl\" (UniqueName: \"kubernetes.io/projected/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-kube-api-access-p7mcl\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:58 crc kubenswrapper[4743]: I0310 15:20:58.957353 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:20:59 crc kubenswrapper[4743]: I0310 15:20:59.406217 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk"] Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.168647 4743 generic.go:334] "Generic (PLEG): container finished" podID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerID="002aec2c229b2cd3b105161ea60896e60f12beac1ca07a5812a5de40eb038277" exitCode=0 Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.168774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" event={"ID":"e3871ddd-8c74-4c7f-a368-75bf19bdd67a","Type":"ContainerDied","Data":"002aec2c229b2cd3b105161ea60896e60f12beac1ca07a5812a5de40eb038277"} Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.169132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" event={"ID":"e3871ddd-8c74-4c7f-a368-75bf19bdd67a","Type":"ContainerStarted","Data":"2e9477ccf77c3f1d420e13d518ff9f7992daf19e4cfd73d2657973801b8e87b0"} Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.734455 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d7jlh"] Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.745118 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d7jlh"] Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.745189 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.936674 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2kp8\" (UniqueName: \"kubernetes.io/projected/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-kube-api-access-l2kp8\") pod \"redhat-operators-d7jlh\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.936795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-utilities\") pod \"redhat-operators-d7jlh\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:00 crc kubenswrapper[4743]: I0310 15:21:00.936853 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-catalog-content\") pod \"redhat-operators-d7jlh\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:01 crc kubenswrapper[4743]: I0310 15:21:01.038111 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2kp8\" (UniqueName: \"kubernetes.io/projected/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-kube-api-access-l2kp8\") pod \"redhat-operators-d7jlh\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:01 crc kubenswrapper[4743]: I0310 15:21:01.038178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-utilities\") pod \"redhat-operators-d7jlh\" (UID: 
\"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:01 crc kubenswrapper[4743]: I0310 15:21:01.038232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-catalog-content\") pod \"redhat-operators-d7jlh\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:01 crc kubenswrapper[4743]: I0310 15:21:01.039184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-catalog-content\") pod \"redhat-operators-d7jlh\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:01 crc kubenswrapper[4743]: I0310 15:21:01.039307 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-utilities\") pod \"redhat-operators-d7jlh\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:01 crc kubenswrapper[4743]: I0310 15:21:01.062657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kp8\" (UniqueName: \"kubernetes.io/projected/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-kube-api-access-l2kp8\") pod \"redhat-operators-d7jlh\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") " pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:01 crc kubenswrapper[4743]: I0310 15:21:01.071289 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:01 crc kubenswrapper[4743]: I0310 15:21:01.293187 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d7jlh"] Mar 10 15:21:02 crc kubenswrapper[4743]: I0310 15:21:02.192436 4743 generic.go:334] "Generic (PLEG): container finished" podID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerID="14c57ae6bee4ef1182c5071776b90dc5ff2f71c7b407b7fd7ceb721497cf359f" exitCode=0 Mar 10 15:21:02 crc kubenswrapper[4743]: I0310 15:21:02.192570 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" event={"ID":"e3871ddd-8c74-4c7f-a368-75bf19bdd67a","Type":"ContainerDied","Data":"14c57ae6bee4ef1182c5071776b90dc5ff2f71c7b407b7fd7ceb721497cf359f"} Mar 10 15:21:02 crc kubenswrapper[4743]: I0310 15:21:02.195152 4743 generic.go:334] "Generic (PLEG): container finished" podID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerID="39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750" exitCode=0 Mar 10 15:21:02 crc kubenswrapper[4743]: I0310 15:21:02.195212 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7jlh" event={"ID":"6f3bbc86-5220-42ca-99a1-c02f2cc758c7","Type":"ContainerDied","Data":"39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750"} Mar 10 15:21:02 crc kubenswrapper[4743]: I0310 15:21:02.195254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7jlh" event={"ID":"6f3bbc86-5220-42ca-99a1-c02f2cc758c7","Type":"ContainerStarted","Data":"6c6ac52799240a196247e10b73fddf85028dbf03fb7aab18574e07a9ac1701c7"} Mar 10 15:21:02 crc kubenswrapper[4743]: I0310 15:21:02.196934 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:21:03 crc kubenswrapper[4743]: I0310 15:21:03.206895 4743 
generic.go:334] "Generic (PLEG): container finished" podID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerID="0ef6972f06ae9dae197684446c2ef1123af6db8499bf14cf26083fb561bff5de" exitCode=0 Mar 10 15:21:03 crc kubenswrapper[4743]: I0310 15:21:03.206964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" event={"ID":"e3871ddd-8c74-4c7f-a368-75bf19bdd67a","Type":"ContainerDied","Data":"0ef6972f06ae9dae197684446c2ef1123af6db8499bf14cf26083fb561bff5de"} Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.214518 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7jlh" event={"ID":"6f3bbc86-5220-42ca-99a1-c02f2cc758c7","Type":"ContainerStarted","Data":"6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07"} Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.535184 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.600142 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7mcl\" (UniqueName: \"kubernetes.io/projected/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-kube-api-access-p7mcl\") pod \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.600231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-util\") pod \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.600285 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-bundle\") pod \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\" (UID: \"e3871ddd-8c74-4c7f-a368-75bf19bdd67a\") " Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.601292 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-bundle" (OuterVolumeSpecName: "bundle") pod "e3871ddd-8c74-4c7f-a368-75bf19bdd67a" (UID: "e3871ddd-8c74-4c7f-a368-75bf19bdd67a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.608136 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-kube-api-access-p7mcl" (OuterVolumeSpecName: "kube-api-access-p7mcl") pod "e3871ddd-8c74-4c7f-a368-75bf19bdd67a" (UID: "e3871ddd-8c74-4c7f-a368-75bf19bdd67a"). InnerVolumeSpecName "kube-api-access-p7mcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.614102 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-util" (OuterVolumeSpecName: "util") pod "e3871ddd-8c74-4c7f-a368-75bf19bdd67a" (UID: "e3871ddd-8c74-4c7f-a368-75bf19bdd67a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.701104 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7mcl\" (UniqueName: \"kubernetes.io/projected/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-kube-api-access-p7mcl\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.701153 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-util\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:04 crc kubenswrapper[4743]: I0310 15:21:04.701163 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3871ddd-8c74-4c7f-a368-75bf19bdd67a-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:05 crc kubenswrapper[4743]: I0310 15:21:05.227590 4743 generic.go:334] "Generic (PLEG): container finished" podID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerID="6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07" exitCode=0 Mar 10 15:21:05 crc kubenswrapper[4743]: I0310 15:21:05.227731 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7jlh" event={"ID":"6f3bbc86-5220-42ca-99a1-c02f2cc758c7","Type":"ContainerDied","Data":"6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07"} Mar 10 15:21:05 crc kubenswrapper[4743]: I0310 15:21:05.234123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" event={"ID":"e3871ddd-8c74-4c7f-a368-75bf19bdd67a","Type":"ContainerDied","Data":"2e9477ccf77c3f1d420e13d518ff9f7992daf19e4cfd73d2657973801b8e87b0"} Mar 10 15:21:05 crc kubenswrapper[4743]: I0310 15:21:05.234184 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e9477ccf77c3f1d420e13d518ff9f7992daf19e4cfd73d2657973801b8e87b0" Mar 10 
15:21:05 crc kubenswrapper[4743]: I0310 15:21:05.234213 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk" Mar 10 15:21:06 crc kubenswrapper[4743]: I0310 15:21:06.253358 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7jlh" event={"ID":"6f3bbc86-5220-42ca-99a1-c02f2cc758c7","Type":"ContainerStarted","Data":"8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c"} Mar 10 15:21:06 crc kubenswrapper[4743]: I0310 15:21:06.273551 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d7jlh" podStartSLOduration=2.553973183 podStartE2EDuration="6.273531055s" podCreationTimestamp="2026-03-10 15:21:00 +0000 UTC" firstStartedPulling="2026-03-10 15:21:02.19657625 +0000 UTC m=+926.903390998" lastFinishedPulling="2026-03-10 15:21:05.916134092 +0000 UTC m=+930.622948870" observedRunningTime="2026-03-10 15:21:06.271837666 +0000 UTC m=+930.978652424" watchObservedRunningTime="2026-03-10 15:21:06.273531055 +0000 UTC m=+930.980345803" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.069289 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d"] Mar 10 15:21:10 crc kubenswrapper[4743]: E0310 15:21:10.069867 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerName="util" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.069885 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerName="util" Mar 10 15:21:10 crc kubenswrapper[4743]: E0310 15:21:10.069903 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerName="extract" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.069910 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerName="extract" Mar 10 15:21:10 crc kubenswrapper[4743]: E0310 15:21:10.069920 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerName="pull" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.069927 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerName="pull" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.070049 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3871ddd-8c74-4c7f-a368-75bf19bdd67a" containerName="extract" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.070494 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.072634 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-4xr6d" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.073023 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.074270 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.082382 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5gdk\" (UniqueName: \"kubernetes.io/projected/c663bb83-e937-4db5-94e3-87f5253b27c9-kube-api-access-q5gdk\") pod \"nmstate-operator-75c5dccd6c-66r4d\" (UID: \"c663bb83-e937-4db5-94e3-87f5253b27c9\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.087175 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d"] Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.183166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5gdk\" (UniqueName: \"kubernetes.io/projected/c663bb83-e937-4db5-94e3-87f5253b27c9-kube-api-access-q5gdk\") pod \"nmstate-operator-75c5dccd6c-66r4d\" (UID: \"c663bb83-e937-4db5-94e3-87f5253b27c9\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.214224 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5gdk\" (UniqueName: \"kubernetes.io/projected/c663bb83-e937-4db5-94e3-87f5253b27c9-kube-api-access-q5gdk\") pod \"nmstate-operator-75c5dccd6c-66r4d\" (UID: \"c663bb83-e937-4db5-94e3-87f5253b27c9\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.313785 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ktv9r"] Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.315360 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.329609 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktv9r"] Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.385387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-utilities\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.385444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-catalog-content\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.385502 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zp7\" (UniqueName: \"kubernetes.io/projected/175c524a-33de-450a-b171-89128b525741-kube-api-access-94zp7\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.386083 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.487388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zp7\" (UniqueName: \"kubernetes.io/projected/175c524a-33de-450a-b171-89128b525741-kube-api-access-94zp7\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.488344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-utilities\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.488396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-catalog-content\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.489144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-catalog-content\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.489246 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-utilities\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " 
pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.510999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zp7\" (UniqueName: \"kubernetes.io/projected/175c524a-33de-450a-b171-89128b525741-kube-api-access-94zp7\") pod \"redhat-marketplace-ktv9r\" (UID: \"175c524a-33de-450a-b171-89128b525741\") " pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.596215 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d"] Mar 10 15:21:10 crc kubenswrapper[4743]: W0310 15:21:10.607228 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc663bb83_e937_4db5_94e3_87f5253b27c9.slice/crio-f60e64ce7b0ed12b1b17d858753751a4670e5a95476ed751db47977f566e4c3e WatchSource:0}: Error finding container f60e64ce7b0ed12b1b17d858753751a4670e5a95476ed751db47977f566e4c3e: Status 404 returned error can't find the container with id f60e64ce7b0ed12b1b17d858753751a4670e5a95476ed751db47977f566e4c3e Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.630649 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:10 crc kubenswrapper[4743]: W0310 15:21:10.883837 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175c524a_33de_450a_b171_89128b525741.slice/crio-ad4770b6a592176c6e4c49bf0775692c60e698165d48a26d7e336bd762af7f3c WatchSource:0}: Error finding container ad4770b6a592176c6e4c49bf0775692c60e698165d48a26d7e336bd762af7f3c: Status 404 returned error can't find the container with id ad4770b6a592176c6e4c49bf0775692c60e698165d48a26d7e336bd762af7f3c Mar 10 15:21:10 crc kubenswrapper[4743]: I0310 15:21:10.886716 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktv9r"] Mar 10 15:21:11 crc kubenswrapper[4743]: I0310 15:21:11.072434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:11 crc kubenswrapper[4743]: I0310 15:21:11.072900 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d7jlh" Mar 10 15:21:11 crc kubenswrapper[4743]: I0310 15:21:11.284351 4743 generic.go:334] "Generic (PLEG): container finished" podID="175c524a-33de-450a-b171-89128b525741" containerID="4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc" exitCode=0 Mar 10 15:21:11 crc kubenswrapper[4743]: I0310 15:21:11.284427 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktv9r" event={"ID":"175c524a-33de-450a-b171-89128b525741","Type":"ContainerDied","Data":"4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc"} Mar 10 15:21:11 crc kubenswrapper[4743]: I0310 15:21:11.284457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktv9r" 
event={"ID":"175c524a-33de-450a-b171-89128b525741","Type":"ContainerStarted","Data":"ad4770b6a592176c6e4c49bf0775692c60e698165d48a26d7e336bd762af7f3c"} Mar 10 15:21:11 crc kubenswrapper[4743]: I0310 15:21:11.286058 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d" event={"ID":"c663bb83-e937-4db5-94e3-87f5253b27c9","Type":"ContainerStarted","Data":"f60e64ce7b0ed12b1b17d858753751a4670e5a95476ed751db47977f566e4c3e"} Mar 10 15:21:12 crc kubenswrapper[4743]: I0310 15:21:12.118171 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d7jlh" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="registry-server" probeResult="failure" output=< Mar 10 15:21:12 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 15:21:12 crc kubenswrapper[4743]: > Mar 10 15:21:12 crc kubenswrapper[4743]: I0310 15:21:12.294913 4743 generic.go:334] "Generic (PLEG): container finished" podID="175c524a-33de-450a-b171-89128b525741" containerID="d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450" exitCode=0 Mar 10 15:21:12 crc kubenswrapper[4743]: I0310 15:21:12.294968 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktv9r" event={"ID":"175c524a-33de-450a-b171-89128b525741","Type":"ContainerDied","Data":"d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450"} Mar 10 15:21:13 crc kubenswrapper[4743]: I0310 15:21:13.303770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d" event={"ID":"c663bb83-e937-4db5-94e3-87f5253b27c9","Type":"ContainerStarted","Data":"5753edd9183e3e98d12c15872e7d1b5d9bb6254984303ef75415f86f899ca667"} Mar 10 15:21:13 crc kubenswrapper[4743]: I0310 15:21:13.308074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktv9r" 
event={"ID":"175c524a-33de-450a-b171-89128b525741","Type":"ContainerStarted","Data":"0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69"} Mar 10 15:21:13 crc kubenswrapper[4743]: I0310 15:21:13.349969 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ktv9r" podStartSLOduration=1.547593671 podStartE2EDuration="3.349948454s" podCreationTimestamp="2026-03-10 15:21:10 +0000 UTC" firstStartedPulling="2026-03-10 15:21:11.286450884 +0000 UTC m=+935.993265632" lastFinishedPulling="2026-03-10 15:21:13.088805667 +0000 UTC m=+937.795620415" observedRunningTime="2026-03-10 15:21:13.34776003 +0000 UTC m=+938.054574778" watchObservedRunningTime="2026-03-10 15:21:13.349948454 +0000 UTC m=+938.056763222" Mar 10 15:21:13 crc kubenswrapper[4743]: I0310 15:21:13.351138 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-66r4d" podStartSLOduration=0.867219314 podStartE2EDuration="3.351131308s" podCreationTimestamp="2026-03-10 15:21:10 +0000 UTC" firstStartedPulling="2026-03-10 15:21:10.609006942 +0000 UTC m=+935.315821690" lastFinishedPulling="2026-03-10 15:21:13.092918936 +0000 UTC m=+937.799733684" observedRunningTime="2026-03-10 15:21:13.325619967 +0000 UTC m=+938.032434735" watchObservedRunningTime="2026-03-10 15:21:13.351131308 +0000 UTC m=+938.057946066" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.820981 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-mzqq9"] Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.822383 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.825406 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-68sln" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.833127 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-mzqq9"] Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.841400 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-926jh"] Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.842303 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.845111 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.860370 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xgtpr"] Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.861846 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.868426 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-926jh"] Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.953060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t9mb\" (UniqueName: \"kubernetes.io/projected/3532faa1-be5a-4242-8596-aa02e6263960-kube-api-access-2t9mb\") pod \"nmstate-metrics-69594cc75-mzqq9\" (UID: \"3532faa1-be5a-4242-8596-aa02e6263960\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.953141 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-ovs-socket\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.953181 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwcfc\" (UniqueName: \"kubernetes.io/projected/a99b4c91-7907-4849-81f1-47627fe794fa-kube-api-access-gwcfc\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.953225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-dbus-socket\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.953263 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-nmstate-lock\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.953289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xswsp\" (UniqueName: \"kubernetes.io/projected/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-kube-api-access-xswsp\") pod \"nmstate-webhook-786f45cff4-926jh\" (UID: \"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.953319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-926jh\" (UID: \"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.976383 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"] Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.977147 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.981772 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-fg9sb" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.982010 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.982180 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 10 15:21:18 crc kubenswrapper[4743]: I0310 15:21:18.993655 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"] Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054558 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t9mb\" (UniqueName: \"kubernetes.io/projected/3532faa1-be5a-4242-8596-aa02e6263960-kube-api-access-2t9mb\") pod \"nmstate-metrics-69594cc75-mzqq9\" (UID: \"3532faa1-be5a-4242-8596-aa02e6263960\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054627 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-ovs-socket\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwcfc\" (UniqueName: \"kubernetes.io/projected/a99b4c91-7907-4849-81f1-47627fe794fa-kube-api-access-gwcfc\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:19 crc 
kubenswrapper[4743]: I0310 15:21:19.054679 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml9jn\" (UniqueName: \"kubernetes.io/projected/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-kube-api-access-ml9jn\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-ovs-socket\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054851 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-dbus-socket\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054879 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-nmstate-lock\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" 
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054904 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xswsp\" (UniqueName: \"kubernetes.io/projected/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-kube-api-access-xswsp\") pod \"nmstate-webhook-786f45cff4-926jh\" (UID: \"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-926jh\" (UID: \"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.054977 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-nmstate-lock\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.055115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a99b4c91-7907-4849-81f1-47627fe794fa-dbus-socket\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 
15:21:19 crc kubenswrapper[4743]: E0310 15:21:19.055277 4743 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Mar 10 15:21:19 crc kubenswrapper[4743]: E0310 15:21:19.055436 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-tls-key-pair podName:59af88c0-1e3a-4bf3-8ad3-4d11a0248c70 nodeName:}" failed. No retries permitted until 2026-03-10 15:21:19.555397641 +0000 UTC m=+944.262212569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-tls-key-pair") pod "nmstate-webhook-786f45cff4-926jh" (UID: "59af88c0-1e3a-4bf3-8ad3-4d11a0248c70") : secret "openshift-nmstate-webhook" not found
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.081225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwcfc\" (UniqueName: \"kubernetes.io/projected/a99b4c91-7907-4849-81f1-47627fe794fa-kube-api-access-gwcfc\") pod \"nmstate-handler-xgtpr\" (UID: \"a99b4c91-7907-4849-81f1-47627fe794fa\") " pod="openshift-nmstate/nmstate-handler-xgtpr"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.081381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xswsp\" (UniqueName: \"kubernetes.io/projected/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-kube-api-access-xswsp\") pod \"nmstate-webhook-786f45cff4-926jh\" (UID: \"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.082302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t9mb\" (UniqueName: \"kubernetes.io/projected/3532faa1-be5a-4242-8596-aa02e6263960-kube-api-access-2t9mb\") pod \"nmstate-metrics-69594cc75-mzqq9\" (UID: \"3532faa1-be5a-4242-8596-aa02e6263960\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.142368 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.155980 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9jn\" (UniqueName: \"kubernetes.io/projected/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-kube-api-access-ml9jn\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.156038 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.156207 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"
Mar 10 15:21:19 crc kubenswrapper[4743]: E0310 15:21:19.156147 4743 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Mar 10 15:21:19 crc kubenswrapper[4743]: E0310 15:21:19.156371 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-plugin-serving-cert podName:12b52c8d-3a76-46a0-a2fe-6279e2537c9f nodeName:}" failed. No retries permitted until 2026-03-10 15:21:19.656352084 +0000 UTC m=+944.363166832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-74xhc" (UID: "12b52c8d-3a76-46a0-a2fe-6279e2537c9f") : secret "plugin-serving-cert" not found
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.168238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.185886 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9jn\" (UniqueName: \"kubernetes.io/projected/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-kube-api-access-ml9jn\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.186182 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xgtpr"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.203206 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-77fcfc7895-mmlwl"]
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.204099 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.247617 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77fcfc7895-mmlwl"]
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.257889 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-serving-cert\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.257966 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-oauth-config\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.257994 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-trusted-ca-bundle\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.258015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-config\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.258039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-oauth-serving-cert\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.258068 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49mgb\" (UniqueName: \"kubernetes.io/projected/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-kube-api-access-49mgb\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.258085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-service-ca\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.359180 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xgtpr" event={"ID":"a99b4c91-7907-4849-81f1-47627fe794fa","Type":"ContainerStarted","Data":"14efe7d1b39c8851a8af1471d1044a0aa1e203d59d0de8a72774dbbd02e7f967"}
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.360183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-oauth-config\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.360246 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-trusted-ca-bundle\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.360274 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-config\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.360300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-oauth-serving-cert\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.360344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49mgb\" (UniqueName: \"kubernetes.io/projected/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-kube-api-access-49mgb\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.360364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-service-ca\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.361770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-trusted-ca-bundle\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.361913 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-service-ca\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.361971 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-serving-cert\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.362599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-config\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.364145 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-oauth-serving-cert\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.367284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-oauth-config\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.370722 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-console-serving-cert\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.380158 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49mgb\" (UniqueName: \"kubernetes.io/projected/03c2d2ff-e47b-4747-9214-e9fcd309b3b8-kube-api-access-49mgb\") pod \"console-77fcfc7895-mmlwl\" (UID: \"03c2d2ff-e47b-4747-9214-e9fcd309b3b8\") " pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.458016 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-mzqq9"]
Mar 10 15:21:19 crc kubenswrapper[4743]: W0310 15:21:19.462482 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3532faa1_be5a_4242_8596_aa02e6263960.slice/crio-cf672351f10184ad685761941ec9075466e0720d9a2cc0413850500699bd8bb8 WatchSource:0}: Error finding container cf672351f10184ad685761941ec9075466e0720d9a2cc0413850500699bd8bb8: Status 404 returned error can't find the container with id cf672351f10184ad685761941ec9075466e0720d9a2cc0413850500699bd8bb8
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.539440 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77fcfc7895-mmlwl"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.564410 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-926jh\" (UID: \"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.567953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/59af88c0-1e3a-4bf3-8ad3-4d11a0248c70-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-926jh\" (UID: \"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.665852 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.671009 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b52c8d-3a76-46a0-a2fe-6279e2537c9f-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-74xhc\" (UID: \"12b52c8d-3a76-46a0-a2fe-6279e2537c9f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.757694 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.761842 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77fcfc7895-mmlwl"]
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.896644 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"
Mar 10 15:21:19 crc kubenswrapper[4743]: I0310 15:21:19.969399 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-926jh"]
Mar 10 15:21:19 crc kubenswrapper[4743]: W0310 15:21:19.971629 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59af88c0_1e3a_4bf3_8ad3_4d11a0248c70.slice/crio-dbdd5a903bfbe210c7426d3bd1cfe30b45ee9ca6ef89342c8fee0831c076cfcf WatchSource:0}: Error finding container dbdd5a903bfbe210c7426d3bd1cfe30b45ee9ca6ef89342c8fee0831c076cfcf: Status 404 returned error can't find the container with id dbdd5a903bfbe210c7426d3bd1cfe30b45ee9ca6ef89342c8fee0831c076cfcf
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.129106 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc"]
Mar 10 15:21:20 crc kubenswrapper[4743]: W0310 15:21:20.134563 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b52c8d_3a76_46a0_a2fe_6279e2537c9f.slice/crio-a7860fc31d373766d7c7465e50694aadb2a6b6699778eb5e41bec482f7e02045 WatchSource:0}: Error finding container a7860fc31d373766d7c7465e50694aadb2a6b6699778eb5e41bec482f7e02045: Status 404 returned error can't find the container with id a7860fc31d373766d7c7465e50694aadb2a6b6699778eb5e41bec482f7e02045
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.375736 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9" event={"ID":"3532faa1-be5a-4242-8596-aa02e6263960","Type":"ContainerStarted","Data":"cf672351f10184ad685761941ec9075466e0720d9a2cc0413850500699bd8bb8"}
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.376805 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc" event={"ID":"12b52c8d-3a76-46a0-a2fe-6279e2537c9f","Type":"ContainerStarted","Data":"a7860fc31d373766d7c7465e50694aadb2a6b6699778eb5e41bec482f7e02045"}
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.378542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77fcfc7895-mmlwl" event={"ID":"03c2d2ff-e47b-4747-9214-e9fcd309b3b8","Type":"ContainerStarted","Data":"8f4a048f19b2851a71d5506a44bcd981361cfc3f349d4dea98795ec14d1b717b"}
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.378613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77fcfc7895-mmlwl" event={"ID":"03c2d2ff-e47b-4747-9214-e9fcd309b3b8","Type":"ContainerStarted","Data":"c36a5ff9cb605e9dc9f01e37bf92286a0c6d7c483b82872a826f2e2358d54f4e"}
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.379641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" event={"ID":"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70","Type":"ContainerStarted","Data":"dbdd5a903bfbe210c7426d3bd1cfe30b45ee9ca6ef89342c8fee0831c076cfcf"}
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.400992 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77fcfc7895-mmlwl" podStartSLOduration=1.400963724 podStartE2EDuration="1.400963724s" podCreationTimestamp="2026-03-10 15:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:21:20.398600415 +0000 UTC m=+945.105415233" watchObservedRunningTime="2026-03-10 15:21:20.400963724 +0000 UTC m=+945.107778472"
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.631293 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ktv9r"
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.631370 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ktv9r"
Mar 10 15:21:20 crc kubenswrapper[4743]: I0310 15:21:20.715330 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ktv9r"
Mar 10 15:21:21 crc kubenswrapper[4743]: I0310 15:21:21.126408 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d7jlh"
Mar 10 15:21:21 crc kubenswrapper[4743]: I0310 15:21:21.181258 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d7jlh"
Mar 10 15:21:21 crc kubenswrapper[4743]: I0310 15:21:21.438520 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ktv9r"
Mar 10 15:21:22 crc kubenswrapper[4743]: I0310 15:21:22.351021 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d7jlh"]
Mar 10 15:21:22 crc kubenswrapper[4743]: I0310 15:21:22.393875 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d7jlh" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="registry-server" containerID="cri-o://8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c" gracePeriod=2
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.027285 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7jlh"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.123536 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-utilities\") pod \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") "
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.123631 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-catalog-content\") pod \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") "
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.123684 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2kp8\" (UniqueName: \"kubernetes.io/projected/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-kube-api-access-l2kp8\") pod \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\" (UID: \"6f3bbc86-5220-42ca-99a1-c02f2cc758c7\") "
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.125420 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-utilities" (OuterVolumeSpecName: "utilities") pod "6f3bbc86-5220-42ca-99a1-c02f2cc758c7" (UID: "6f3bbc86-5220-42ca-99a1-c02f2cc758c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.130894 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-kube-api-access-l2kp8" (OuterVolumeSpecName: "kube-api-access-l2kp8") pod "6f3bbc86-5220-42ca-99a1-c02f2cc758c7" (UID: "6f3bbc86-5220-42ca-99a1-c02f2cc758c7"). InnerVolumeSpecName "kube-api-access-l2kp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.225602 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.225641 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2kp8\" (UniqueName: \"kubernetes.io/projected/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-kube-api-access-l2kp8\") on node \"crc\" DevicePath \"\""
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.277981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f3bbc86-5220-42ca-99a1-c02f2cc758c7" (UID: "6f3bbc86-5220-42ca-99a1-c02f2cc758c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.326715 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f3bbc86-5220-42ca-99a1-c02f2cc758c7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.403382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xgtpr" event={"ID":"a99b4c91-7907-4849-81f1-47627fe794fa","Type":"ContainerStarted","Data":"a3b6e1283972d16d6aafa70fef34c2dc0f3b9ac4560e92b3ba5f9c27a9a3d521"}
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.403575 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xgtpr"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.405479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9" event={"ID":"3532faa1-be5a-4242-8596-aa02e6263960","Type":"ContainerStarted","Data":"0aead4858a92f03acc5053ab9592feb0446c9019afc75ead0e2190c8bf1fd2b3"}
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.407867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc" event={"ID":"12b52c8d-3a76-46a0-a2fe-6279e2537c9f","Type":"ContainerStarted","Data":"0fb20c74f77f69100792ec939cc186d5910d68bf0f692ef8b493a4d457699e80"}
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.411891 4743 generic.go:334] "Generic (PLEG): container finished" podID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerID="8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c" exitCode=0
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.411995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7jlh" event={"ID":"6f3bbc86-5220-42ca-99a1-c02f2cc758c7","Type":"ContainerDied","Data":"8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c"}
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.412020 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7jlh"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.412066 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7jlh" event={"ID":"6f3bbc86-5220-42ca-99a1-c02f2cc758c7","Type":"ContainerDied","Data":"6c6ac52799240a196247e10b73fddf85028dbf03fb7aab18574e07a9ac1701c7"}
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.412105 4743 scope.go:117] "RemoveContainer" containerID="8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.416236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" event={"ID":"59af88c0-1e3a-4bf3-8ad3-4d11a0248c70","Type":"ContainerStarted","Data":"b707f9f75be1e202d5473fc3c1353bfdab88abec5711827c948225d6a786339f"}
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.416417 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.428465 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xgtpr" podStartSLOduration=1.884007665 podStartE2EDuration="5.42843802s" podCreationTimestamp="2026-03-10 15:21:18 +0000 UTC" firstStartedPulling="2026-03-10 15:21:19.340892116 +0000 UTC m=+944.047706864" lastFinishedPulling="2026-03-10 15:21:22.885322471 +0000 UTC m=+947.592137219" observedRunningTime="2026-03-10 15:21:23.426844984 +0000 UTC m=+948.133659742" watchObservedRunningTime="2026-03-10 15:21:23.42843802 +0000 UTC m=+948.135252768"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.442192 4743 scope.go:117] "RemoveContainer" containerID="6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.455758 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-74xhc" podStartSLOduration=2.709644813 podStartE2EDuration="5.455741734s" podCreationTimestamp="2026-03-10 15:21:18 +0000 UTC" firstStartedPulling="2026-03-10 15:21:20.136549882 +0000 UTC m=+944.843364640" lastFinishedPulling="2026-03-10 15:21:22.882646813 +0000 UTC m=+947.589461561" observedRunningTime="2026-03-10 15:21:23.451783899 +0000 UTC m=+948.158598647" watchObservedRunningTime="2026-03-10 15:21:23.455741734 +0000 UTC m=+948.162556482"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.478554 4743 scope.go:117] "RemoveContainer" containerID="39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.491487 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" podStartSLOduration=2.542435345 podStartE2EDuration="5.491467062s" podCreationTimestamp="2026-03-10 15:21:18 +0000 UTC" firstStartedPulling="2026-03-10 15:21:19.974468083 +0000 UTC m=+944.681282831" lastFinishedPulling="2026-03-10 15:21:22.9234998 +0000 UTC m=+947.630314548" observedRunningTime="2026-03-10 15:21:23.483184771 +0000 UTC m=+948.189999549" watchObservedRunningTime="2026-03-10 15:21:23.491467062 +0000 UTC m=+948.198281810"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.508460 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d7jlh"]
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.512475 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d7jlh"]
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.512603 4743 scope.go:117] "RemoveContainer" containerID="8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c"
Mar 10 15:21:23 crc kubenswrapper[4743]: E0310 15:21:23.513120 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c\": container with ID starting with 8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c not found: ID does not exist" containerID="8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.513151 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c"} err="failed to get container status \"8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c\": rpc error: code = NotFound desc = could not find container \"8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c\": container with ID starting with 8ba30838efac5a8e23f9b080c8f36dde4bc3dc7945c466b829c4b0d5f517308c not found: ID does not exist"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.513206 4743 scope.go:117] "RemoveContainer" containerID="6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07"
Mar 10 15:21:23 crc kubenswrapper[4743]: E0310 15:21:23.513544 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07\": container with ID starting with 6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07 not found: ID does not exist" containerID="6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.513584 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07"} err="failed to get container status \"6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07\": rpc error: code = NotFound desc = could not find container \"6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07\": container with ID starting with 6ab5e63ab2311380a3d7dd46dc13a091fc660d9719302fe8ae92cb7303862f07 not found: ID does not exist"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.513621 4743 scope.go:117] "RemoveContainer" containerID="39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750"
Mar 10 15:21:23 crc kubenswrapper[4743]: E0310 15:21:23.513959 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750\": container with ID starting with 39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750 not found: ID does not exist" containerID="39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.513996 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750"} err="failed to get container status \"39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750\": rpc error: code = NotFound desc = could not find container \"39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750\": container with ID starting with 39fb65520f5cda587537f804171edd2ace20636a4ca6648988f89a25bbf96750 not found: ID does not exist"
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.750420 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktv9r"]
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.750699 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ktv9r" podUID="175c524a-33de-450a-b171-89128b525741" containerName="registry-server" containerID="cri-o://0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69" gracePeriod=2
Mar 10 15:21:23 crc kubenswrapper[4743]: I0310 15:21:23.924227 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" path="/var/lib/kubelet/pods/6f3bbc86-5220-42ca-99a1-c02f2cc758c7/volumes"
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.188540 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktv9r"
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.346163 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-utilities\") pod \"175c524a-33de-450a-b171-89128b525741\" (UID: \"175c524a-33de-450a-b171-89128b525741\") "
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.346333 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-catalog-content\") pod \"175c524a-33de-450a-b171-89128b525741\" (UID: \"175c524a-33de-450a-b171-89128b525741\") "
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.346378 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94zp7\" (UniqueName: \"kubernetes.io/projected/175c524a-33de-450a-b171-89128b525741-kube-api-access-94zp7\") pod \"175c524a-33de-450a-b171-89128b525741\" (UID: \"175c524a-33de-450a-b171-89128b525741\") "
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.347115 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-utilities" (OuterVolumeSpecName: "utilities") pod "175c524a-33de-450a-b171-89128b525741" (UID: "175c524a-33de-450a-b171-89128b525741"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.352371 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175c524a-33de-450a-b171-89128b525741-kube-api-access-94zp7" (OuterVolumeSpecName: "kube-api-access-94zp7") pod "175c524a-33de-450a-b171-89128b525741" (UID: "175c524a-33de-450a-b171-89128b525741"). InnerVolumeSpecName "kube-api-access-94zp7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.377280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "175c524a-33de-450a-b171-89128b525741" (UID: "175c524a-33de-450a-b171-89128b525741"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.426725 4743 generic.go:334] "Generic (PLEG): container finished" podID="175c524a-33de-450a-b171-89128b525741" containerID="0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69" exitCode=0
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.426829 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktv9r" event={"ID":"175c524a-33de-450a-b171-89128b525741","Type":"ContainerDied","Data":"0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69"}
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.426963 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktv9r" event={"ID":"175c524a-33de-450a-b171-89128b525741","Type":"ContainerDied","Data":"ad4770b6a592176c6e4c49bf0775692c60e698165d48a26d7e336bd762af7f3c"}
Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.426987 4743 scope.go:117] "RemoveContainer"
containerID="0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.426803 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktv9r" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.443688 4743 scope.go:117] "RemoveContainer" containerID="d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.448200 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.448235 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94zp7\" (UniqueName: \"kubernetes.io/projected/175c524a-33de-450a-b171-89128b525741-kube-api-access-94zp7\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.448247 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175c524a-33de-450a-b171-89128b525741-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.458508 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktv9r"] Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.461963 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktv9r"] Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.464717 4743 scope.go:117] "RemoveContainer" containerID="4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.484310 4743 scope.go:117] "RemoveContainer" containerID="0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69" Mar 10 15:21:24 crc 
kubenswrapper[4743]: E0310 15:21:24.485130 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69\": container with ID starting with 0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69 not found: ID does not exist" containerID="0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.485180 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69"} err="failed to get container status \"0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69\": rpc error: code = NotFound desc = could not find container \"0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69\": container with ID starting with 0cf5e6366e87ecbba98953d6d785e459115be3518dc5dcdfcb04ae885894ff69 not found: ID does not exist" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.485209 4743 scope.go:117] "RemoveContainer" containerID="d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450" Mar 10 15:21:24 crc kubenswrapper[4743]: E0310 15:21:24.486060 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450\": container with ID starting with d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450 not found: ID does not exist" containerID="d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.486092 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450"} err="failed to get container status 
\"d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450\": rpc error: code = NotFound desc = could not find container \"d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450\": container with ID starting with d1bd409598c9d845203ae71cff08079def14eb73409bba126f8d4aa0b2317450 not found: ID does not exist" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.486110 4743 scope.go:117] "RemoveContainer" containerID="4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc" Mar 10 15:21:24 crc kubenswrapper[4743]: E0310 15:21:24.486767 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc\": container with ID starting with 4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc not found: ID does not exist" containerID="4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc" Mar 10 15:21:24 crc kubenswrapper[4743]: I0310 15:21:24.486792 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc"} err="failed to get container status \"4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc\": rpc error: code = NotFound desc = could not find container \"4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc\": container with ID starting with 4baa2e79ebf19c97fcf43d636df658f839445b3c434744990b6fc11cee6ddacc not found: ID does not exist" Mar 10 15:21:25 crc kubenswrapper[4743]: I0310 15:21:25.437371 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9" event={"ID":"3532faa1-be5a-4242-8596-aa02e6263960","Type":"ContainerStarted","Data":"85826a2d042ab52094e06afa7d1df8760b5ef741938b1a1f51e1c3c68c351ae1"} Mar 10 15:21:25 crc kubenswrapper[4743]: I0310 15:21:25.922889 4743 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="175c524a-33de-450a-b171-89128b525741" path="/var/lib/kubelet/pods/175c524a-33de-450a-b171-89128b525741/volumes" Mar 10 15:21:29 crc kubenswrapper[4743]: I0310 15:21:29.215905 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xgtpr" Mar 10 15:21:29 crc kubenswrapper[4743]: I0310 15:21:29.235035 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-mzqq9" podStartSLOduration=5.440338906 podStartE2EDuration="11.235003978s" podCreationTimestamp="2026-03-10 15:21:18 +0000 UTC" firstStartedPulling="2026-03-10 15:21:19.464556138 +0000 UTC m=+944.171370886" lastFinishedPulling="2026-03-10 15:21:25.25922121 +0000 UTC m=+949.966035958" observedRunningTime="2026-03-10 15:21:25.456249334 +0000 UTC m=+950.163064082" watchObservedRunningTime="2026-03-10 15:21:29.235003978 +0000 UTC m=+953.941818766" Mar 10 15:21:29 crc kubenswrapper[4743]: I0310 15:21:29.540632 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77fcfc7895-mmlwl" Mar 10 15:21:29 crc kubenswrapper[4743]: I0310 15:21:29.540709 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77fcfc7895-mmlwl" Mar 10 15:21:29 crc kubenswrapper[4743]: I0310 15:21:29.544769 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77fcfc7895-mmlwl" Mar 10 15:21:30 crc kubenswrapper[4743]: I0310 15:21:30.482692 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77fcfc7895-mmlwl" Mar 10 15:21:30 crc kubenswrapper[4743]: I0310 15:21:30.538780 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v9bc6"] Mar 10 15:21:39 crc kubenswrapper[4743]: I0310 15:21:39.768692 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-926jh" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.438860 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-269dv"] Mar 10 15:21:40 crc kubenswrapper[4743]: E0310 15:21:40.439488 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="extract-utilities" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.439503 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="extract-utilities" Mar 10 15:21:40 crc kubenswrapper[4743]: E0310 15:21:40.439516 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="extract-content" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.439522 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="extract-content" Mar 10 15:21:40 crc kubenswrapper[4743]: E0310 15:21:40.439534 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="registry-server" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.439540 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="registry-server" Mar 10 15:21:40 crc kubenswrapper[4743]: E0310 15:21:40.439554 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175c524a-33de-450a-b171-89128b525741" containerName="extract-content" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.439560 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="175c524a-33de-450a-b171-89128b525741" containerName="extract-content" Mar 10 15:21:40 crc kubenswrapper[4743]: E0310 15:21:40.439567 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="175c524a-33de-450a-b171-89128b525741" containerName="extract-utilities" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.439573 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="175c524a-33de-450a-b171-89128b525741" containerName="extract-utilities" Mar 10 15:21:40 crc kubenswrapper[4743]: E0310 15:21:40.439583 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175c524a-33de-450a-b171-89128b525741" containerName="registry-server" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.439589 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="175c524a-33de-450a-b171-89128b525741" containerName="registry-server" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.439679 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="175c524a-33de-450a-b171-89128b525741" containerName="registry-server" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.439698 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3bbc86-5220-42ca-99a1-c02f2cc758c7" containerName="registry-server" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.440489 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.459471 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-269dv"] Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.497197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr62x\" (UniqueName: \"kubernetes.io/projected/36286c80-d36a-4232-be43-da4096181726-kube-api-access-zr62x\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.497263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-utilities\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.497641 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-catalog-content\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.598803 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr62x\" (UniqueName: \"kubernetes.io/projected/36286c80-d36a-4232-be43-da4096181726-kube-api-access-zr62x\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.598912 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-utilities\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.598975 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-catalog-content\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.599840 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-utilities\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.599885 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-catalog-content\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.629531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr62x\" (UniqueName: \"kubernetes.io/projected/36286c80-d36a-4232-be43-da4096181726-kube-api-access-zr62x\") pod \"certified-operators-269dv\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.762359 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:40 crc kubenswrapper[4743]: I0310 15:21:40.996848 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-269dv"] Mar 10 15:21:41 crc kubenswrapper[4743]: I0310 15:21:41.552583 4743 generic.go:334] "Generic (PLEG): container finished" podID="36286c80-d36a-4232-be43-da4096181726" containerID="1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618" exitCode=0 Mar 10 15:21:41 crc kubenswrapper[4743]: I0310 15:21:41.552700 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-269dv" event={"ID":"36286c80-d36a-4232-be43-da4096181726","Type":"ContainerDied","Data":"1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618"} Mar 10 15:21:41 crc kubenswrapper[4743]: I0310 15:21:41.552940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-269dv" event={"ID":"36286c80-d36a-4232-be43-da4096181726","Type":"ContainerStarted","Data":"bdca232f9b62fd2e2c6d61336f1f379eca3219976ee21c83c244c999e651c9d4"} Mar 10 15:21:42 crc kubenswrapper[4743]: I0310 15:21:42.561759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-269dv" event={"ID":"36286c80-d36a-4232-be43-da4096181726","Type":"ContainerStarted","Data":"29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe"} Mar 10 15:21:43 crc kubenswrapper[4743]: I0310 15:21:43.573598 4743 generic.go:334] "Generic (PLEG): container finished" podID="36286c80-d36a-4232-be43-da4096181726" containerID="29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe" exitCode=0 Mar 10 15:21:43 crc kubenswrapper[4743]: I0310 15:21:43.573655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-269dv" 
event={"ID":"36286c80-d36a-4232-be43-da4096181726","Type":"ContainerDied","Data":"29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe"} Mar 10 15:21:44 crc kubenswrapper[4743]: I0310 15:21:44.581408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-269dv" event={"ID":"36286c80-d36a-4232-be43-da4096181726","Type":"ContainerStarted","Data":"f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88"} Mar 10 15:21:44 crc kubenswrapper[4743]: I0310 15:21:44.608220 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-269dv" podStartSLOduration=1.926249292 podStartE2EDuration="4.60819689s" podCreationTimestamp="2026-03-10 15:21:40 +0000 UTC" firstStartedPulling="2026-03-10 15:21:41.555231723 +0000 UTC m=+966.262046471" lastFinishedPulling="2026-03-10 15:21:44.237179311 +0000 UTC m=+968.943994069" observedRunningTime="2026-03-10 15:21:44.60510774 +0000 UTC m=+969.311922488" watchObservedRunningTime="2026-03-10 15:21:44.60819689 +0000 UTC m=+969.315011638" Mar 10 15:21:50 crc kubenswrapper[4743]: I0310 15:21:50.762712 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:50 crc kubenswrapper[4743]: I0310 15:21:50.763509 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:50 crc kubenswrapper[4743]: I0310 15:21:50.833791 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:51 crc kubenswrapper[4743]: I0310 15:21:51.671966 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:51 crc kubenswrapper[4743]: I0310 15:21:51.732929 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-269dv"] Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.514779 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5"] Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.516516 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.523195 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.531076 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5"] Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.591415 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.591528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.591776 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k98w5\" (UniqueName: \"kubernetes.io/projected/04d8c366-53ae-4237-a441-9acd9c158909-kube-api-access-k98w5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.642131 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-269dv" podUID="36286c80-d36a-4232-be43-da4096181726" containerName="registry-server" containerID="cri-o://f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88" gracePeriod=2 Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.694109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.694257 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k98w5\" (UniqueName: \"kubernetes.io/projected/04d8c366-53ae-4237-a441-9acd9c158909-kube-api-access-k98w5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.694401 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: 
\"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.694898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.694907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.715241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k98w5\" (UniqueName: \"kubernetes.io/projected/04d8c366-53ae-4237-a441-9acd9c158909-kube-api-access-k98w5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:53 crc kubenswrapper[4743]: I0310 15:21:53.838657 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.014913 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.068654 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5"] Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.099207 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-utilities\") pod \"36286c80-d36a-4232-be43-da4096181726\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.099298 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-catalog-content\") pod \"36286c80-d36a-4232-be43-da4096181726\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.099351 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr62x\" (UniqueName: \"kubernetes.io/projected/36286c80-d36a-4232-be43-da4096181726-kube-api-access-zr62x\") pod \"36286c80-d36a-4232-be43-da4096181726\" (UID: \"36286c80-d36a-4232-be43-da4096181726\") " Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.106387 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36286c80-d36a-4232-be43-da4096181726-kube-api-access-zr62x" (OuterVolumeSpecName: "kube-api-access-zr62x") pod "36286c80-d36a-4232-be43-da4096181726" (UID: "36286c80-d36a-4232-be43-da4096181726"). InnerVolumeSpecName "kube-api-access-zr62x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.111635 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-utilities" (OuterVolumeSpecName: "utilities") pod "36286c80-d36a-4232-be43-da4096181726" (UID: "36286c80-d36a-4232-be43-da4096181726"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.166295 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36286c80-d36a-4232-be43-da4096181726" (UID: "36286c80-d36a-4232-be43-da4096181726"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.200909 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.200962 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr62x\" (UniqueName: \"kubernetes.io/projected/36286c80-d36a-4232-be43-da4096181726-kube-api-access-zr62x\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.200979 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36286c80-d36a-4232-be43-da4096181726-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.653306 4743 generic.go:334] "Generic (PLEG): container finished" podID="36286c80-d36a-4232-be43-da4096181726" containerID="f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88" exitCode=0 Mar 
10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.653421 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-269dv" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.653450 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-269dv" event={"ID":"36286c80-d36a-4232-be43-da4096181726","Type":"ContainerDied","Data":"f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88"} Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.653560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-269dv" event={"ID":"36286c80-d36a-4232-be43-da4096181726","Type":"ContainerDied","Data":"bdca232f9b62fd2e2c6d61336f1f379eca3219976ee21c83c244c999e651c9d4"} Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.653600 4743 scope.go:117] "RemoveContainer" containerID="f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.655253 4743 generic.go:334] "Generic (PLEG): container finished" podID="04d8c366-53ae-4237-a441-9acd9c158909" containerID="4730f915841e9d5a338716b129e81624e03e0cb0d17a838bffe1ff8a4bd2120a" exitCode=0 Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.655328 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" event={"ID":"04d8c366-53ae-4237-a441-9acd9c158909","Type":"ContainerDied","Data":"4730f915841e9d5a338716b129e81624e03e0cb0d17a838bffe1ff8a4bd2120a"} Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.655367 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" event={"ID":"04d8c366-53ae-4237-a441-9acd9c158909","Type":"ContainerStarted","Data":"901fa2b01a6f4ce013e250a6ba402942a108b6dd1372b2bed6ac9e88a5981bfd"} Mar 10 
15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.678470 4743 scope.go:117] "RemoveContainer" containerID="29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.707635 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-269dv"] Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.713645 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-269dv"] Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.714387 4743 scope.go:117] "RemoveContainer" containerID="1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.728559 4743 scope.go:117] "RemoveContainer" containerID="f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88" Mar 10 15:21:54 crc kubenswrapper[4743]: E0310 15:21:54.729258 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88\": container with ID starting with f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88 not found: ID does not exist" containerID="f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.729315 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88"} err="failed to get container status \"f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88\": rpc error: code = NotFound desc = could not find container \"f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88\": container with ID starting with f1536eaf751d3e55cb68d24e7cec19b6bbcebe799db9ddb0b9bfbe41706d8c88 not found: ID does not exist" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.729342 
4743 scope.go:117] "RemoveContainer" containerID="29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe" Mar 10 15:21:54 crc kubenswrapper[4743]: E0310 15:21:54.729879 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe\": container with ID starting with 29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe not found: ID does not exist" containerID="29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.729926 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe"} err="failed to get container status \"29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe\": rpc error: code = NotFound desc = could not find container \"29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe\": container with ID starting with 29bf480c595eae620f49986ef6074ce4e324409e79a01397b8d6cf4d779ff7fe not found: ID does not exist" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.729954 4743 scope.go:117] "RemoveContainer" containerID="1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618" Mar 10 15:21:54 crc kubenswrapper[4743]: E0310 15:21:54.730362 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618\": container with ID starting with 1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618 not found: ID does not exist" containerID="1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618" Mar 10 15:21:54 crc kubenswrapper[4743]: I0310 15:21:54.730389 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618"} err="failed to get container status \"1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618\": rpc error: code = NotFound desc = could not find container \"1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618\": container with ID starting with 1dbfbffd39159295ae3f667f1b3f537a48086f6d739862d041928d9562d07618 not found: ID does not exist" Mar 10 15:21:55 crc kubenswrapper[4743]: I0310 15:21:55.585975 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-v9bc6" podUID="bc7402d9-c20f-4429-bda9-db2b1ccddf8e" containerName="console" containerID="cri-o://b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6" gracePeriod=15 Mar 10 15:21:55 crc kubenswrapper[4743]: I0310 15:21:55.939968 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36286c80-d36a-4232-be43-da4096181726" path="/var/lib/kubelet/pods/36286c80-d36a-4232-be43-da4096181726/volumes" Mar 10 15:21:55 crc kubenswrapper[4743]: I0310 15:21:55.994704 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v9bc6_bc7402d9-c20f-4429-bda9-db2b1ccddf8e/console/0.log" Mar 10 15:21:55 crc kubenswrapper[4743]: I0310 15:21:55.994784 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.131968 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chd4m\" (UniqueName: \"kubernetes.io/projected/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-kube-api-access-chd4m\") pod \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.132040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-config\") pod \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.132100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-oauth-serving-cert\") pod \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.133146 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bc7402d9-c20f-4429-bda9-db2b1ccddf8e" (UID: "bc7402d9-c20f-4429-bda9-db2b1ccddf8e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.133163 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-config" (OuterVolumeSpecName: "console-config") pod "bc7402d9-c20f-4429-bda9-db2b1ccddf8e" (UID: "bc7402d9-c20f-4429-bda9-db2b1ccddf8e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.133235 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-serving-cert\") pod \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.133319 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-service-ca\") pod \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.133985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-service-ca" (OuterVolumeSpecName: "service-ca") pod "bc7402d9-c20f-4429-bda9-db2b1ccddf8e" (UID: "bc7402d9-c20f-4429-bda9-db2b1ccddf8e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.134059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-trusted-ca-bundle\") pod \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.134115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-oauth-config\") pod \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\" (UID: \"bc7402d9-c20f-4429-bda9-db2b1ccddf8e\") " Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.134419 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.134499 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.134520 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.134720 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bc7402d9-c20f-4429-bda9-db2b1ccddf8e" (UID: "bc7402d9-c20f-4429-bda9-db2b1ccddf8e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.138695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bc7402d9-c20f-4429-bda9-db2b1ccddf8e" (UID: "bc7402d9-c20f-4429-bda9-db2b1ccddf8e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.139182 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-kube-api-access-chd4m" (OuterVolumeSpecName: "kube-api-access-chd4m") pod "bc7402d9-c20f-4429-bda9-db2b1ccddf8e" (UID: "bc7402d9-c20f-4429-bda9-db2b1ccddf8e"). InnerVolumeSpecName "kube-api-access-chd4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.139535 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bc7402d9-c20f-4429-bda9-db2b1ccddf8e" (UID: "bc7402d9-c20f-4429-bda9-db2b1ccddf8e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.235928 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.236412 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chd4m\" (UniqueName: \"kubernetes.io/projected/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-kube-api-access-chd4m\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.236486 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.236503 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc7402d9-c20f-4429-bda9-db2b1ccddf8e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.673056 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v9bc6_bc7402d9-c20f-4429-bda9-db2b1ccddf8e/console/0.log" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.673126 4743 generic.go:334] "Generic (PLEG): container finished" podID="bc7402d9-c20f-4429-bda9-db2b1ccddf8e" containerID="b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6" exitCode=2 Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.673202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v9bc6" event={"ID":"bc7402d9-c20f-4429-bda9-db2b1ccddf8e","Type":"ContainerDied","Data":"b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6"} Mar 10 15:21:56 crc kubenswrapper[4743]: 
I0310 15:21:56.673238 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v9bc6" event={"ID":"bc7402d9-c20f-4429-bda9-db2b1ccddf8e","Type":"ContainerDied","Data":"ceb2c78694f440ee9a33dd83374acb9173a88e5e0e9b538ca0b58dae64f1f974"} Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.673251 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v9bc6" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.673260 4743 scope.go:117] "RemoveContainer" containerID="b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.676444 4743 generic.go:334] "Generic (PLEG): container finished" podID="04d8c366-53ae-4237-a441-9acd9c158909" containerID="882e535ed9c248d0bad7591881abaddd61fa38a5911ac8048e88d9a7a0d64b49" exitCode=0 Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.676517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" event={"ID":"04d8c366-53ae-4237-a441-9acd9c158909","Type":"ContainerDied","Data":"882e535ed9c248d0bad7591881abaddd61fa38a5911ac8048e88d9a7a0d64b49"} Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.706772 4743 scope.go:117] "RemoveContainer" containerID="b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6" Mar 10 15:21:56 crc kubenswrapper[4743]: E0310 15:21:56.707475 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6\": container with ID starting with b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6 not found: ID does not exist" containerID="b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.707522 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6"} err="failed to get container status \"b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6\": rpc error: code = NotFound desc = could not find container \"b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6\": container with ID starting with b476b361db6e26509286440d8400e889ace6af7d681d9128cba8c5b5a7d336c6 not found: ID does not exist" Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.724393 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v9bc6"] Mar 10 15:21:56 crc kubenswrapper[4743]: I0310 15:21:56.729383 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-v9bc6"] Mar 10 15:21:57 crc kubenswrapper[4743]: I0310 15:21:57.688370 4743 generic.go:334] "Generic (PLEG): container finished" podID="04d8c366-53ae-4237-a441-9acd9c158909" containerID="109c6d43256947303686723a1a71ff7982c5acdcb0379e4d7f88312e14cb7c71" exitCode=0 Mar 10 15:21:57 crc kubenswrapper[4743]: I0310 15:21:57.688440 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" event={"ID":"04d8c366-53ae-4237-a441-9acd9c158909","Type":"ContainerDied","Data":"109c6d43256947303686723a1a71ff7982c5acdcb0379e4d7f88312e14cb7c71"} Mar 10 15:21:57 crc kubenswrapper[4743]: I0310 15:21:57.934159 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc7402d9-c20f-4429-bda9-db2b1ccddf8e" path="/var/lib/kubelet/pods/bc7402d9-c20f-4429-bda9-db2b1ccddf8e/volumes" Mar 10 15:21:58 crc kubenswrapper[4743]: I0310 15:21:58.977779 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.085945 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k98w5\" (UniqueName: \"kubernetes.io/projected/04d8c366-53ae-4237-a441-9acd9c158909-kube-api-access-k98w5\") pod \"04d8c366-53ae-4237-a441-9acd9c158909\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.086086 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-util\") pod \"04d8c366-53ae-4237-a441-9acd9c158909\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.086162 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-bundle\") pod \"04d8c366-53ae-4237-a441-9acd9c158909\" (UID: \"04d8c366-53ae-4237-a441-9acd9c158909\") " Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.087414 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-bundle" (OuterVolumeSpecName: "bundle") pod "04d8c366-53ae-4237-a441-9acd9c158909" (UID: "04d8c366-53ae-4237-a441-9acd9c158909"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.095148 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d8c366-53ae-4237-a441-9acd9c158909-kube-api-access-k98w5" (OuterVolumeSpecName: "kube-api-access-k98w5") pod "04d8c366-53ae-4237-a441-9acd9c158909" (UID: "04d8c366-53ae-4237-a441-9acd9c158909"). InnerVolumeSpecName "kube-api-access-k98w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.102100 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-util" (OuterVolumeSpecName: "util") pod "04d8c366-53ae-4237-a441-9acd9c158909" (UID: "04d8c366-53ae-4237-a441-9acd9c158909"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.190995 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k98w5\" (UniqueName: \"kubernetes.io/projected/04d8c366-53ae-4237-a441-9acd9c158909-kube-api-access-k98w5\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.191028 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-util\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.191038 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d8c366-53ae-4237-a441-9acd9c158909-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.711376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" event={"ID":"04d8c366-53ae-4237-a441-9acd9c158909","Type":"ContainerDied","Data":"901fa2b01a6f4ce013e250a6ba402942a108b6dd1372b2bed6ac9e88a5981bfd"} Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.711482 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5" Mar 10 15:21:59 crc kubenswrapper[4743]: I0310 15:21:59.711463 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901fa2b01a6f4ce013e250a6ba402942a108b6dd1372b2bed6ac9e88a5981bfd" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147108 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552602-msfrq"] Mar 10 15:22:00 crc kubenswrapper[4743]: E0310 15:22:00.147443 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d8c366-53ae-4237-a441-9acd9c158909" containerName="util" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147458 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d8c366-53ae-4237-a441-9acd9c158909" containerName="util" Mar 10 15:22:00 crc kubenswrapper[4743]: E0310 15:22:00.147473 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36286c80-d36a-4232-be43-da4096181726" containerName="extract-utilities" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147479 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="36286c80-d36a-4232-be43-da4096181726" containerName="extract-utilities" Mar 10 15:22:00 crc kubenswrapper[4743]: E0310 15:22:00.147487 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc7402d9-c20f-4429-bda9-db2b1ccddf8e" containerName="console" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147496 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7402d9-c20f-4429-bda9-db2b1ccddf8e" containerName="console" Mar 10 15:22:00 crc kubenswrapper[4743]: E0310 15:22:00.147505 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36286c80-d36a-4232-be43-da4096181726" containerName="registry-server" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147510 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36286c80-d36a-4232-be43-da4096181726" containerName="registry-server" Mar 10 15:22:00 crc kubenswrapper[4743]: E0310 15:22:00.147524 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d8c366-53ae-4237-a441-9acd9c158909" containerName="pull" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147530 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d8c366-53ae-4237-a441-9acd9c158909" containerName="pull" Mar 10 15:22:00 crc kubenswrapper[4743]: E0310 15:22:00.147545 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36286c80-d36a-4232-be43-da4096181726" containerName="extract-content" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147551 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="36286c80-d36a-4232-be43-da4096181726" containerName="extract-content" Mar 10 15:22:00 crc kubenswrapper[4743]: E0310 15:22:00.147562 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d8c366-53ae-4237-a441-9acd9c158909" containerName="extract" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147570 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d8c366-53ae-4237-a441-9acd9c158909" containerName="extract" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147678 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="36286c80-d36a-4232-be43-da4096181726" containerName="registry-server" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147689 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc7402d9-c20f-4429-bda9-db2b1ccddf8e" containerName="console" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.147704 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d8c366-53ae-4237-a441-9acd9c158909" containerName="extract" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.148232 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552602-msfrq" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.155192 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.156035 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.156469 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.164762 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552602-msfrq"] Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.309374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8xtw\" (UniqueName: \"kubernetes.io/projected/b92e45e9-8db6-4fde-aca3-2d0c5024d77e-kube-api-access-x8xtw\") pod \"auto-csr-approver-29552602-msfrq\" (UID: \"b92e45e9-8db6-4fde-aca3-2d0c5024d77e\") " pod="openshift-infra/auto-csr-approver-29552602-msfrq" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.411205 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8xtw\" (UniqueName: \"kubernetes.io/projected/b92e45e9-8db6-4fde-aca3-2d0c5024d77e-kube-api-access-x8xtw\") pod \"auto-csr-approver-29552602-msfrq\" (UID: \"b92e45e9-8db6-4fde-aca3-2d0c5024d77e\") " pod="openshift-infra/auto-csr-approver-29552602-msfrq" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.432226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8xtw\" (UniqueName: \"kubernetes.io/projected/b92e45e9-8db6-4fde-aca3-2d0c5024d77e-kube-api-access-x8xtw\") pod \"auto-csr-approver-29552602-msfrq\" (UID: \"b92e45e9-8db6-4fde-aca3-2d0c5024d77e\") " 
pod="openshift-infra/auto-csr-approver-29552602-msfrq" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.465269 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552602-msfrq" Mar 10 15:22:00 crc kubenswrapper[4743]: I0310 15:22:00.702619 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552602-msfrq"] Mar 10 15:22:00 crc kubenswrapper[4743]: W0310 15:22:00.714151 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92e45e9_8db6_4fde_aca3_2d0c5024d77e.slice/crio-3a9959a875d6ff1580409d3dc9e20f1d132da1e12afffdc48f02ea4fd46c209b WatchSource:0}: Error finding container 3a9959a875d6ff1580409d3dc9e20f1d132da1e12afffdc48f02ea4fd46c209b: Status 404 returned error can't find the container with id 3a9959a875d6ff1580409d3dc9e20f1d132da1e12afffdc48f02ea4fd46c209b Mar 10 15:22:01 crc kubenswrapper[4743]: I0310 15:22:01.730590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552602-msfrq" event={"ID":"b92e45e9-8db6-4fde-aca3-2d0c5024d77e","Type":"ContainerStarted","Data":"3a9959a875d6ff1580409d3dc9e20f1d132da1e12afffdc48f02ea4fd46c209b"} Mar 10 15:22:02 crc kubenswrapper[4743]: I0310 15:22:02.740388 4743 generic.go:334] "Generic (PLEG): container finished" podID="b92e45e9-8db6-4fde-aca3-2d0c5024d77e" containerID="70764383e6ce924a21c83bd9b4931b0226da63de849bc2f595fe0103e5771bbd" exitCode=0 Mar 10 15:22:02 crc kubenswrapper[4743]: I0310 15:22:02.740497 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552602-msfrq" event={"ID":"b92e45e9-8db6-4fde-aca3-2d0c5024d77e","Type":"ContainerDied","Data":"70764383e6ce924a21c83bd9b4931b0226da63de849bc2f595fe0103e5771bbd"} Mar 10 15:22:04 crc kubenswrapper[4743]: I0310 15:22:04.050126 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552602-msfrq" Mar 10 15:22:04 crc kubenswrapper[4743]: I0310 15:22:04.169716 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8xtw\" (UniqueName: \"kubernetes.io/projected/b92e45e9-8db6-4fde-aca3-2d0c5024d77e-kube-api-access-x8xtw\") pod \"b92e45e9-8db6-4fde-aca3-2d0c5024d77e\" (UID: \"b92e45e9-8db6-4fde-aca3-2d0c5024d77e\") " Mar 10 15:22:04 crc kubenswrapper[4743]: I0310 15:22:04.175954 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92e45e9-8db6-4fde-aca3-2d0c5024d77e-kube-api-access-x8xtw" (OuterVolumeSpecName: "kube-api-access-x8xtw") pod "b92e45e9-8db6-4fde-aca3-2d0c5024d77e" (UID: "b92e45e9-8db6-4fde-aca3-2d0c5024d77e"). InnerVolumeSpecName "kube-api-access-x8xtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:22:04 crc kubenswrapper[4743]: I0310 15:22:04.272060 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8xtw\" (UniqueName: \"kubernetes.io/projected/b92e45e9-8db6-4fde-aca3-2d0c5024d77e-kube-api-access-x8xtw\") on node \"crc\" DevicePath \"\"" Mar 10 15:22:04 crc kubenswrapper[4743]: I0310 15:22:04.756045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552602-msfrq" event={"ID":"b92e45e9-8db6-4fde-aca3-2d0c5024d77e","Type":"ContainerDied","Data":"3a9959a875d6ff1580409d3dc9e20f1d132da1e12afffdc48f02ea4fd46c209b"} Mar 10 15:22:04 crc kubenswrapper[4743]: I0310 15:22:04.756113 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a9959a875d6ff1580409d3dc9e20f1d132da1e12afffdc48f02ea4fd46c209b" Mar 10 15:22:04 crc kubenswrapper[4743]: I0310 15:22:04.756216 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552602-msfrq" Mar 10 15:22:05 crc kubenswrapper[4743]: I0310 15:22:05.123137 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-8pxvn"] Mar 10 15:22:05 crc kubenswrapper[4743]: I0310 15:22:05.129503 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-8pxvn"] Mar 10 15:22:05 crc kubenswrapper[4743]: I0310 15:22:05.922894 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8036ca91-3733-4e07-a583-4265f162c6ed" path="/var/lib/kubelet/pods/8036ca91-3733-4e07-a583-4265f162c6ed/volumes" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.153450 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg"] Mar 10 15:22:09 crc kubenswrapper[4743]: E0310 15:22:09.155671 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92e45e9-8db6-4fde-aca3-2d0c5024d77e" containerName="oc" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.155758 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92e45e9-8db6-4fde-aca3-2d0c5024d77e" containerName="oc" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.155965 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92e45e9-8db6-4fde-aca3-2d0c5024d77e" containerName="oc" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.156729 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.159231 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.159232 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7gz7p" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.159922 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.160136 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.160318 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.171730 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg"] Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.241263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7df72fd9-f705-42a8-b630-88cdf35f8874-apiservice-cert\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: \"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.241728 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7df72fd9-f705-42a8-b630-88cdf35f8874-webhook-cert\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: 
\"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.241750 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndtpx\" (UniqueName: \"kubernetes.io/projected/7df72fd9-f705-42a8-b630-88cdf35f8874-kube-api-access-ndtpx\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: \"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.342732 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7df72fd9-f705-42a8-b630-88cdf35f8874-webhook-cert\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: \"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.342786 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndtpx\" (UniqueName: \"kubernetes.io/projected/7df72fd9-f705-42a8-b630-88cdf35f8874-kube-api-access-ndtpx\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: \"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.342895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7df72fd9-f705-42a8-b630-88cdf35f8874-apiservice-cert\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: \"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.350350 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7df72fd9-f705-42a8-b630-88cdf35f8874-webhook-cert\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: \"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.351403 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7df72fd9-f705-42a8-b630-88cdf35f8874-apiservice-cert\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: \"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.373926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndtpx\" (UniqueName: \"kubernetes.io/projected/7df72fd9-f705-42a8-b630-88cdf35f8874-kube-api-access-ndtpx\") pod \"metallb-operator-controller-manager-85fcd4c69f-jxzhg\" (UID: \"7df72fd9-f705-42a8-b630-88cdf35f8874\") " pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.476138 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.493185 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95"] Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.494210 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.496088 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2szfm" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.496665 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.497309 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.512243 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95"] Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.647987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43bd5d43-af77-493a-a93e-0110cfd5c307-webhook-cert\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.648372 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7flf2\" (UniqueName: \"kubernetes.io/projected/43bd5d43-af77-493a-a93e-0110cfd5c307-kube-api-access-7flf2\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.648406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/43bd5d43-af77-493a-a93e-0110cfd5c307-apiservice-cert\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.749398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43bd5d43-af77-493a-a93e-0110cfd5c307-webhook-cert\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.749464 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7flf2\" (UniqueName: \"kubernetes.io/projected/43bd5d43-af77-493a-a93e-0110cfd5c307-kube-api-access-7flf2\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.749494 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43bd5d43-af77-493a-a93e-0110cfd5c307-apiservice-cert\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.759211 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43bd5d43-af77-493a-a93e-0110cfd5c307-apiservice-cert\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 
15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.774511 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7flf2\" (UniqueName: \"kubernetes.io/projected/43bd5d43-af77-493a-a93e-0110cfd5c307-kube-api-access-7flf2\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.792589 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43bd5d43-af77-493a-a93e-0110cfd5c307-webhook-cert\") pod \"metallb-operator-webhook-server-66b54d9848-tkr95\" (UID: \"43bd5d43-af77-493a-a93e-0110cfd5c307\") " pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.826629 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg"] Mar 10 15:22:09 crc kubenswrapper[4743]: I0310 15:22:09.862774 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.333650 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95"] Mar 10 15:22:10 crc kubenswrapper[4743]: W0310 15:22:10.341885 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43bd5d43_af77_493a_a93e_0110cfd5c307.slice/crio-6ff8c7ee6c29c4f492c62d5be90d9f9ed3d1ea77ba98c9462ad5f89a7906f163 WatchSource:0}: Error finding container 6ff8c7ee6c29c4f492c62d5be90d9f9ed3d1ea77ba98c9462ad5f89a7906f163: Status 404 returned error can't find the container with id 6ff8c7ee6c29c4f492c62d5be90d9f9ed3d1ea77ba98c9462ad5f89a7906f163 Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.804720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" event={"ID":"43bd5d43-af77-493a-a93e-0110cfd5c307","Type":"ContainerStarted","Data":"6ff8c7ee6c29c4f492c62d5be90d9f9ed3d1ea77ba98c9462ad5f89a7906f163"} Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.806307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" event={"ID":"7df72fd9-f705-42a8-b630-88cdf35f8874","Type":"ContainerStarted","Data":"8ecaff36e0f9bb05c7a4b08e09ab16a7fb63812d17b6e5b44e46f3b20479b7a4"} Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.881761 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kqmwg"] Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.883161 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.891176 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqmwg"] Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.965532 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-catalog-content\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.965591 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cspm\" (UniqueName: \"kubernetes.io/projected/01494be9-3195-4273-b3a0-00850b6ab029-kube-api-access-5cspm\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:10 crc kubenswrapper[4743]: I0310 15:22:10.965670 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-utilities\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.067782 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-utilities\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.067925 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-catalog-content\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.068333 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cspm\" (UniqueName: \"kubernetes.io/projected/01494be9-3195-4273-b3a0-00850b6ab029-kube-api-access-5cspm\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.068430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-utilities\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.068753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-catalog-content\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.100650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cspm\" (UniqueName: \"kubernetes.io/projected/01494be9-3195-4273-b3a0-00850b6ab029-kube-api-access-5cspm\") pod \"community-operators-kqmwg\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.210702 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.253035 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.253106 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.705129 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqmwg"] Mar 10 15:22:11 crc kubenswrapper[4743]: I0310 15:22:11.840190 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqmwg" event={"ID":"01494be9-3195-4273-b3a0-00850b6ab029","Type":"ContainerStarted","Data":"2fd6c8a2d960f9679a2005c93b16071e6c5c9af2a0f99c52e4bd5fc2188e92a6"} Mar 10 15:22:12 crc kubenswrapper[4743]: I0310 15:22:12.848695 4743 generic.go:334] "Generic (PLEG): container finished" podID="01494be9-3195-4273-b3a0-00850b6ab029" containerID="322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99" exitCode=0 Mar 10 15:22:12 crc kubenswrapper[4743]: I0310 15:22:12.849125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqmwg" event={"ID":"01494be9-3195-4273-b3a0-00850b6ab029","Type":"ContainerDied","Data":"322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99"} Mar 10 15:22:13 crc kubenswrapper[4743]: I0310 15:22:13.860298 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" event={"ID":"7df72fd9-f705-42a8-b630-88cdf35f8874","Type":"ContainerStarted","Data":"0eba358ce073c8e3b0499d8ffc49387af5e16739f922597921a23c07701838d3"} Mar 10 15:22:13 crc kubenswrapper[4743]: I0310 15:22:13.860856 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" Mar 10 15:22:15 crc kubenswrapper[4743]: I0310 15:22:15.880563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" event={"ID":"43bd5d43-af77-493a-a93e-0110cfd5c307","Type":"ContainerStarted","Data":"0deffcf5c0c478d48d878739a137c6027f61be31ddd57f700efb9fc22ca7afa7"} Mar 10 15:22:15 crc kubenswrapper[4743]: I0310 15:22:15.880918 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:15 crc kubenswrapper[4743]: I0310 15:22:15.883521 4743 generic.go:334] "Generic (PLEG): container finished" podID="01494be9-3195-4273-b3a0-00850b6ab029" containerID="ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68" exitCode=0 Mar 10 15:22:15 crc kubenswrapper[4743]: I0310 15:22:15.883587 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqmwg" event={"ID":"01494be9-3195-4273-b3a0-00850b6ab029","Type":"ContainerDied","Data":"ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68"} Mar 10 15:22:15 crc kubenswrapper[4743]: I0310 15:22:15.904371 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg" podStartSLOduration=3.312938461 podStartE2EDuration="6.904349622s" podCreationTimestamp="2026-03-10 15:22:09 +0000 UTC" firstStartedPulling="2026-03-10 15:22:09.848996011 +0000 UTC m=+994.555810759" 
lastFinishedPulling="2026-03-10 15:22:13.440407162 +0000 UTC m=+998.147221920" observedRunningTime="2026-03-10 15:22:13.890200772 +0000 UTC m=+998.597015520" watchObservedRunningTime="2026-03-10 15:22:15.904349622 +0000 UTC m=+1000.611164370" Mar 10 15:22:15 crc kubenswrapper[4743]: I0310 15:22:15.906593 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" podStartSLOduration=1.895524062 podStartE2EDuration="6.906582876s" podCreationTimestamp="2026-03-10 15:22:09 +0000 UTC" firstStartedPulling="2026-03-10 15:22:10.344192192 +0000 UTC m=+995.051006940" lastFinishedPulling="2026-03-10 15:22:15.355251006 +0000 UTC m=+1000.062065754" observedRunningTime="2026-03-10 15:22:15.901919333 +0000 UTC m=+1000.608734081" watchObservedRunningTime="2026-03-10 15:22:15.906582876 +0000 UTC m=+1000.613397624" Mar 10 15:22:17 crc kubenswrapper[4743]: I0310 15:22:17.898205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqmwg" event={"ID":"01494be9-3195-4273-b3a0-00850b6ab029","Type":"ContainerStarted","Data":"4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94"} Mar 10 15:22:17 crc kubenswrapper[4743]: I0310 15:22:17.924908 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kqmwg" podStartSLOduration=4.037459882 podStartE2EDuration="7.924888396s" podCreationTimestamp="2026-03-10 15:22:10 +0000 UTC" firstStartedPulling="2026-03-10 15:22:13.377755076 +0000 UTC m=+998.084569824" lastFinishedPulling="2026-03-10 15:22:17.26518357 +0000 UTC m=+1001.971998338" observedRunningTime="2026-03-10 15:22:17.921312043 +0000 UTC m=+1002.628126791" watchObservedRunningTime="2026-03-10 15:22:17.924888396 +0000 UTC m=+1002.631703144" Mar 10 15:22:21 crc kubenswrapper[4743]: I0310 15:22:21.211577 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:21 crc kubenswrapper[4743]: I0310 15:22:21.212221 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:21 crc kubenswrapper[4743]: I0310 15:22:21.261087 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:29 crc kubenswrapper[4743]: I0310 15:22:29.871412 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66b54d9848-tkr95" Mar 10 15:22:31 crc kubenswrapper[4743]: I0310 15:22:31.256777 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:33 crc kubenswrapper[4743]: I0310 15:22:33.470026 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqmwg"] Mar 10 15:22:33 crc kubenswrapper[4743]: I0310 15:22:33.470521 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kqmwg" podUID="01494be9-3195-4273-b3a0-00850b6ab029" containerName="registry-server" containerID="cri-o://4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94" gracePeriod=2 Mar 10 15:22:33 crc kubenswrapper[4743]: I0310 15:22:33.863641 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.004423 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cspm\" (UniqueName: \"kubernetes.io/projected/01494be9-3195-4273-b3a0-00850b6ab029-kube-api-access-5cspm\") pod \"01494be9-3195-4273-b3a0-00850b6ab029\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.004541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-utilities\") pod \"01494be9-3195-4273-b3a0-00850b6ab029\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.004618 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-catalog-content\") pod \"01494be9-3195-4273-b3a0-00850b6ab029\" (UID: \"01494be9-3195-4273-b3a0-00850b6ab029\") " Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.005659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-utilities" (OuterVolumeSpecName: "utilities") pod "01494be9-3195-4273-b3a0-00850b6ab029" (UID: "01494be9-3195-4273-b3a0-00850b6ab029"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.009514 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01494be9-3195-4273-b3a0-00850b6ab029-kube-api-access-5cspm" (OuterVolumeSpecName: "kube-api-access-5cspm") pod "01494be9-3195-4273-b3a0-00850b6ab029" (UID: "01494be9-3195-4273-b3a0-00850b6ab029"). InnerVolumeSpecName "kube-api-access-5cspm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.043997 4743 generic.go:334] "Generic (PLEG): container finished" podID="01494be9-3195-4273-b3a0-00850b6ab029" containerID="4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94" exitCode=0 Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.044048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqmwg" event={"ID":"01494be9-3195-4273-b3a0-00850b6ab029","Type":"ContainerDied","Data":"4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94"} Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.044080 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqmwg" event={"ID":"01494be9-3195-4273-b3a0-00850b6ab029","Type":"ContainerDied","Data":"2fd6c8a2d960f9679a2005c93b16071e6c5c9af2a0f99c52e4bd5fc2188e92a6"} Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.044101 4743 scope.go:117] "RemoveContainer" containerID="4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94" Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.044107 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kqmwg" Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.092979 4743 scope.go:117] "RemoveContainer" containerID="ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68" Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.102430 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01494be9-3195-4273-b3a0-00850b6ab029" (UID: "01494be9-3195-4273-b3a0-00850b6ab029"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.106101 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cspm\" (UniqueName: \"kubernetes.io/projected/01494be9-3195-4273-b3a0-00850b6ab029-kube-api-access-5cspm\") on node \"crc\" DevicePath \"\""
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.106133 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.106145 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01494be9-3195-4273-b3a0-00850b6ab029-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.130325 4743 scope.go:117] "RemoveContainer" containerID="322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99"
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.152387 4743 scope.go:117] "RemoveContainer" containerID="4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94"
Mar 10 15:22:34 crc kubenswrapper[4743]: E0310 15:22:34.152949 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94\": container with ID starting with 4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94 not found: ID does not exist" containerID="4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94"
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.152998 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94"} err="failed to get container status \"4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94\": rpc error: code = NotFound desc = could not find container \"4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94\": container with ID starting with 4e3a26c2e6fd5be27af4f8557ab63b20e80e3f4bad28f5356731cc58a02d4c94 not found: ID does not exist"
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.153032 4743 scope.go:117] "RemoveContainer" containerID="ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68"
Mar 10 15:22:34 crc kubenswrapper[4743]: E0310 15:22:34.153727 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68\": container with ID starting with ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68 not found: ID does not exist" containerID="ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68"
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.153756 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68"} err="failed to get container status \"ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68\": rpc error: code = NotFound desc = could not find container \"ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68\": container with ID starting with ea8514c6600c3120494475f128c895e12e9fc38eb93907c62ea6cc99da106c68 not found: ID does not exist"
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.153772 4743 scope.go:117] "RemoveContainer" containerID="322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99"
Mar 10 15:22:34 crc kubenswrapper[4743]: E0310 15:22:34.154299 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99\": container with ID starting with 322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99 not found: ID does not exist" containerID="322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99"
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.154332 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99"} err="failed to get container status \"322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99\": rpc error: code = NotFound desc = could not find container \"322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99\": container with ID starting with 322ba06e2d1f661db381d4adff046aebba2acea094e81648dece3574bf0ead99 not found: ID does not exist"
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.371637 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqmwg"]
Mar 10 15:22:34 crc kubenswrapper[4743]: I0310 15:22:34.375424 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kqmwg"]
Mar 10 15:22:35 crc kubenswrapper[4743]: I0310 15:22:35.940933 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01494be9-3195-4273-b3a0-00850b6ab029" path="/var/lib/kubelet/pods/01494be9-3195-4273-b3a0-00850b6ab029/volumes"
Mar 10 15:22:36 crc kubenswrapper[4743]: I0310 15:22:36.609895 4743 scope.go:117] "RemoveContainer" containerID="f0791db09e5053e2242de874db3e1d91429bdfcf4f6251eacff0ff41ad552725"
Mar 10 15:22:41 crc kubenswrapper[4743]: I0310 15:22:41.252460 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:22:41 crc kubenswrapper[4743]: I0310 15:22:41.252572 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:22:49 crc kubenswrapper[4743]: I0310 15:22:49.479202 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-85fcd4c69f-jxzhg"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.172248 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"]
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.172979 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01494be9-3195-4273-b3a0-00850b6ab029" containerName="registry-server"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.173071 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="01494be9-3195-4273-b3a0-00850b6ab029" containerName="registry-server"
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.173135 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01494be9-3195-4273-b3a0-00850b6ab029" containerName="extract-utilities"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.173204 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="01494be9-3195-4273-b3a0-00850b6ab029" containerName="extract-utilities"
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.173275 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01494be9-3195-4273-b3a0-00850b6ab029" containerName="extract-content"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.173335 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="01494be9-3195-4273-b3a0-00850b6ab029" containerName="extract-content"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.173496 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="01494be9-3195-4273-b3a0-00850b6ab029" containerName="registry-server"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.174139 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.176192 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gqlkc"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.176740 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.177217 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7lm5r"]
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.181912 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.184243 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.186883 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.196540 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"]
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6j5j\" (UniqueName: \"kubernetes.io/projected/ec1cebe8-2ebe-4319-8df4-b6616411c83a-kube-api-access-c6j5j\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5hw9\" (UniqueName: \"kubernetes.io/projected/596c245f-700e-4539-a0f9-a9c30906383a-kube-api-access-h5hw9\") pod \"frr-k8s-webhook-server-7f989f654f-c6tlb\" (UID: \"596c245f-700e-4539-a0f9-a9c30906383a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-reloader\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics-certs\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235878 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-sockets\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235891 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-conf\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235911 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/596c245f-700e-4539-a0f9-a9c30906383a-cert\") pod \"frr-k8s-webhook-server-7f989f654f-c6tlb\" (UID: \"596c245f-700e-4539-a0f9-a9c30906383a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.235927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-startup\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.265934 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-fnzjw"]
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.267137 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.269154 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.269156 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-m59gr"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.269556 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.270040 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.291169 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-clgfh"]
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.292348 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.300020 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.314951 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-clgfh"]
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.336860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-sockets\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.336926 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-conf\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.336956 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metallb-excludel2\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.336985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce84f3bd-aef2-4e6e-8d42-829c75efc758-metrics-certs\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/596c245f-700e-4539-a0f9-a9c30906383a-cert\") pod \"frr-k8s-webhook-server-7f989f654f-c6tlb\" (UID: \"596c245f-700e-4539-a0f9-a9c30906383a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337036 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-startup\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6j5j\" (UniqueName: \"kubernetes.io/projected/ec1cebe8-2ebe-4319-8df4-b6616411c83a-kube-api-access-c6j5j\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5hw9\" (UniqueName: \"kubernetes.io/projected/596c245f-700e-4539-a0f9-a9c30906383a-kube-api-access-h5hw9\") pod \"frr-k8s-webhook-server-7f989f654f-c6tlb\" (UID: \"596c245f-700e-4539-a0f9-a9c30906383a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce84f3bd-aef2-4e6e-8d42-829c75efc758-cert\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-reloader\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337454 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics-certs\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337551 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337592 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-sockets\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvcr7\" (UniqueName: \"kubernetes.io/projected/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-kube-api-access-qvcr7\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337641 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwvtq\" (UniqueName: \"kubernetes.io/projected/ce84f3bd-aef2-4e6e-8d42-829c75efc758-kube-api-access-rwvtq\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-conf\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.337789 4743 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.337878 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics-certs podName:ec1cebe8-2ebe-4319-8df4-b6616411c83a nodeName:}" failed. No retries permitted until 2026-03-10 15:22:50.837853645 +0000 UTC m=+1035.544668393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics-certs") pod "frr-k8s-7lm5r" (UID: "ec1cebe8-2ebe-4319-8df4-b6616411c83a") : secret "frr-k8s-certs-secret" not found
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337897 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metrics-certs\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.337929 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.338125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ec1cebe8-2ebe-4319-8df4-b6616411c83a-reloader\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.338133 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ec1cebe8-2ebe-4319-8df4-b6616411c83a-frr-startup\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.342495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/596c245f-700e-4539-a0f9-a9c30906383a-cert\") pod \"frr-k8s-webhook-server-7f989f654f-c6tlb\" (UID: \"596c245f-700e-4539-a0f9-a9c30906383a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.357982 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6j5j\" (UniqueName: \"kubernetes.io/projected/ec1cebe8-2ebe-4319-8df4-b6616411c83a-kube-api-access-c6j5j\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.361399 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5hw9\" (UniqueName: \"kubernetes.io/projected/596c245f-700e-4539-a0f9-a9c30906383a-kube-api-access-h5hw9\") pod \"frr-k8s-webhook-server-7f989f654f-c6tlb\" (UID: \"596c245f-700e-4539-a0f9-a9c30906383a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.438884 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvcr7\" (UniqueName: \"kubernetes.io/projected/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-kube-api-access-qvcr7\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.438936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwvtq\" (UniqueName: \"kubernetes.io/projected/ce84f3bd-aef2-4e6e-8d42-829c75efc758-kube-api-access-rwvtq\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.438994 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metrics-certs\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.439029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metallb-excludel2\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.439055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce84f3bd-aef2-4e6e-8d42-829c75efc758-metrics-certs\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.439092 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.439137 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce84f3bd-aef2-4e6e-8d42-829c75efc758-cert\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.439515 4743 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.439589 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metrics-certs podName:0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1 nodeName:}" failed. No retries permitted until 2026-03-10 15:22:50.9395718 +0000 UTC m=+1035.646386548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metrics-certs") pod "speaker-fnzjw" (UID: "0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1") : secret "speaker-certs-secret" not found
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.439594 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.439688 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist podName:0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1 nodeName:}" failed. No retries permitted until 2026-03-10 15:22:50.939667163 +0000 UTC m=+1035.646481901 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist") pod "speaker-fnzjw" (UID: "0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1") : secret "metallb-memberlist" not found
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.440395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metallb-excludel2\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.441323 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.443262 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce84f3bd-aef2-4e6e-8d42-829c75efc758-metrics-certs\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.454305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce84f3bd-aef2-4e6e-8d42-829c75efc758-cert\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.454656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvcr7\" (UniqueName: \"kubernetes.io/projected/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-kube-api-access-qvcr7\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.456897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwvtq\" (UniqueName: \"kubernetes.io/projected/ce84f3bd-aef2-4e6e-8d42-829c75efc758-kube-api-access-rwvtq\") pod \"controller-86ddb6bd46-clgfh\" (UID: \"ce84f3bd-aef2-4e6e-8d42-829c75efc758\") " pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.502569 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.608129 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.845995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics-certs\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.850640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec1cebe8-2ebe-4319-8df4-b6616411c83a-metrics-certs\") pod \"frr-k8s-7lm5r\" (UID: \"ec1cebe8-2ebe-4319-8df4-b6616411c83a\") " pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.947993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metrics-certs\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.948140 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.948287 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 10 15:22:50 crc kubenswrapper[4743]: E0310 15:22:50.948393 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist podName:0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1 nodeName:}" failed. No retries permitted until 2026-03-10 15:22:51.948360441 +0000 UTC m=+1036.655175229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist") pod "speaker-fnzjw" (UID: "0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1") : secret "metallb-memberlist" not found
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.953577 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-metrics-certs\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:50 crc kubenswrapper[4743]: I0310 15:22:50.989237 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb"]
Mar 10 15:22:51 crc kubenswrapper[4743]: I0310 15:22:51.040415 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-clgfh"]
Mar 10 15:22:51 crc kubenswrapper[4743]: W0310 15:22:51.043868 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce84f3bd_aef2_4e6e_8d42_829c75efc758.slice/crio-24a5864a5ae7bc54e90904c2be04c94b0660beb6d1b69c9ce174c8132ac7e37d WatchSource:0}: Error finding container 24a5864a5ae7bc54e90904c2be04c94b0660beb6d1b69c9ce174c8132ac7e37d: Status 404 returned error can't find the container with id 24a5864a5ae7bc54e90904c2be04c94b0660beb6d1b69c9ce174c8132ac7e37d
Mar 10 15:22:51 crc kubenswrapper[4743]: I0310 15:22:51.113909 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7lm5r"
Mar 10 15:22:51 crc kubenswrapper[4743]: I0310 15:22:51.156742 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb" event={"ID":"596c245f-700e-4539-a0f9-a9c30906383a","Type":"ContainerStarted","Data":"e252e003fec32c3fb8d42005ae8dbeaee4232f0ef6e1d894b8eaa7e20516f194"}
Mar 10 15:22:51 crc kubenswrapper[4743]: I0310 15:22:51.157584 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-clgfh" event={"ID":"ce84f3bd-aef2-4e6e-8d42-829c75efc758","Type":"ContainerStarted","Data":"24a5864a5ae7bc54e90904c2be04c94b0660beb6d1b69c9ce174c8132ac7e37d"}
Mar 10 15:22:51 crc kubenswrapper[4743]: I0310 15:22:51.961965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:51 crc kubenswrapper[4743]: I0310 15:22:51.968448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1-memberlist\") pod \"speaker-fnzjw\" (UID: \"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1\") " pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:52 crc kubenswrapper[4743]: I0310 15:22:52.082637 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:52 crc kubenswrapper[4743]: W0310 15:22:52.100897 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fa0eb2e_e4a0_47a2_b91d_113b9c3d68d1.slice/crio-1a18b10176237c520b942deb630564feb28071d1e35e67a877bf51f3a82888ee WatchSource:0}: Error finding container 1a18b10176237c520b942deb630564feb28071d1e35e67a877bf51f3a82888ee: Status 404 returned error can't find the container with id 1a18b10176237c520b942deb630564feb28071d1e35e67a877bf51f3a82888ee
Mar 10 15:22:52 crc kubenswrapper[4743]: I0310 15:22:52.184542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-clgfh" event={"ID":"ce84f3bd-aef2-4e6e-8d42-829c75efc758","Type":"ContainerStarted","Data":"ad55e3c67ffa858d56f83ceddc7d8a43bf7365e83fc5936ca6990f02487b29b8"}
Mar 10 15:22:52 crc kubenswrapper[4743]: I0310 15:22:52.184589 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-clgfh" event={"ID":"ce84f3bd-aef2-4e6e-8d42-829c75efc758","Type":"ContainerStarted","Data":"7e6bd337651210bf20fccddc49116175a67445e004191d2d88fff3f01d10e7c4"}
Mar 10 15:22:52 crc kubenswrapper[4743]: I0310 15:22:52.185874 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerStarted","Data":"a8cc6db5d80be72cb05545e253c61237d93f2a2433f2000242c13b2cbb2fe8eb"}
Mar 10 15:22:52 crc kubenswrapper[4743]: I0310 15:22:52.187078 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fnzjw" event={"ID":"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1","Type":"ContainerStarted","Data":"1a18b10176237c520b942deb630564feb28071d1e35e67a877bf51f3a82888ee"}
Mar 10 15:22:52 crc kubenswrapper[4743]: I0310 15:22:52.206922 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-clgfh" podStartSLOduration=2.206898967 podStartE2EDuration="2.206898967s" podCreationTimestamp="2026-03-10 15:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:22:52.201481862 +0000 UTC m=+1036.908296630" watchObservedRunningTime="2026-03-10 15:22:52.206898967 +0000 UTC m=+1036.913713725"
Mar 10 15:22:53 crc kubenswrapper[4743]: I0310 15:22:53.203143 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fnzjw" event={"ID":"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1","Type":"ContainerStarted","Data":"4b2794bf9eb207f92e51fb9c817ecfe86c334c7b400ac932e38f798b56ec0aeb"}
Mar 10 15:22:53 crc kubenswrapper[4743]: I0310 15:22:53.203519 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fnzjw" event={"ID":"0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1","Type":"ContainerStarted","Data":"cdb0bf0730e6b168e7a5d41450764774cc9b93415a0b639561a43cc2fe4ca056"}
Mar 10 15:22:53 crc kubenswrapper[4743]: I0310 15:22:53.203543 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-clgfh"
Mar 10 15:22:53 crc kubenswrapper[4743]: I0310 15:22:53.226166 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-fnzjw" podStartSLOduration=3.226151836 podStartE2EDuration="3.226151836s" podCreationTimestamp="2026-03-10 15:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:22:53.222965325 +0000 UTC m=+1037.929780073" watchObservedRunningTime="2026-03-10 15:22:53.226151836 +0000 UTC m=+1037.932966584"
Mar 10 15:22:54 crc kubenswrapper[4743]: I0310 15:22:54.211759 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-fnzjw"
Mar 10 15:22:59 crc kubenswrapper[4743]:
I0310 15:22:59.257066 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec1cebe8-2ebe-4319-8df4-b6616411c83a" containerID="90080037bca0adc419e739df5aa96952cc0a596ef221a6d8a5e3ca96254198a2" exitCode=0 Mar 10 15:22:59 crc kubenswrapper[4743]: I0310 15:22:59.257154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerDied","Data":"90080037bca0adc419e739df5aa96952cc0a596ef221a6d8a5e3ca96254198a2"} Mar 10 15:22:59 crc kubenswrapper[4743]: I0310 15:22:59.261159 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb" event={"ID":"596c245f-700e-4539-a0f9-a9c30906383a","Type":"ContainerStarted","Data":"2930801adfbc84d25275d45465e4c80037a140a38f57e1be2982377a3623a6f9"} Mar 10 15:22:59 crc kubenswrapper[4743]: I0310 15:22:59.261321 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb" Mar 10 15:22:59 crc kubenswrapper[4743]: I0310 15:22:59.334436 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb" podStartSLOduration=1.767182417 podStartE2EDuration="9.334416183s" podCreationTimestamp="2026-03-10 15:22:50 +0000 UTC" firstStartedPulling="2026-03-10 15:22:50.999354453 +0000 UTC m=+1035.706169201" lastFinishedPulling="2026-03-10 15:22:58.566588229 +0000 UTC m=+1043.273402967" observedRunningTime="2026-03-10 15:22:59.32139834 +0000 UTC m=+1044.028213088" watchObservedRunningTime="2026-03-10 15:22:59.334416183 +0000 UTC m=+1044.041230931" Mar 10 15:23:00 crc kubenswrapper[4743]: I0310 15:23:00.269914 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec1cebe8-2ebe-4319-8df4-b6616411c83a" containerID="6fbb4fef0b1d5bf8fbc939c71c2eb2d434b5fd5ffe1b399d1eb2d93e66bc3bba" exitCode=0 Mar 10 15:23:00 crc kubenswrapper[4743]: I0310 
15:23:00.269986 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerDied","Data":"6fbb4fef0b1d5bf8fbc939c71c2eb2d434b5fd5ffe1b399d1eb2d93e66bc3bba"} Mar 10 15:23:01 crc kubenswrapper[4743]: I0310 15:23:01.284149 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec1cebe8-2ebe-4319-8df4-b6616411c83a" containerID="c8f6aed6bf25deabb8a5ac7d09364de490a7dd1988d68e9e042d9ddf35d3cfa1" exitCode=0 Mar 10 15:23:01 crc kubenswrapper[4743]: I0310 15:23:01.284241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerDied","Data":"c8f6aed6bf25deabb8a5ac7d09364de490a7dd1988d68e9e042d9ddf35d3cfa1"} Mar 10 15:23:02 crc kubenswrapper[4743]: I0310 15:23:02.086887 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-fnzjw" Mar 10 15:23:02 crc kubenswrapper[4743]: I0310 15:23:02.297462 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerStarted","Data":"5f2fd16a1b8f4d7673b59b7bc0df4032ad42bda97c81d756ee9d073ae5876a58"} Mar 10 15:23:02 crc kubenswrapper[4743]: I0310 15:23:02.297510 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerStarted","Data":"b486d88d8fe80d6d10a8e1344aded2a207749a69643c2757b6c3fbd63baeef4e"} Mar 10 15:23:02 crc kubenswrapper[4743]: I0310 15:23:02.297522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerStarted","Data":"1e07e80f2f3ecf84a5cde2e32f3ab16eb2c7fbcd73204657d32fdd19ba47bdcc"} Mar 10 15:23:02 crc kubenswrapper[4743]: I0310 15:23:02.297532 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerStarted","Data":"a4521c87849d286a682e4059361243bb7a0296f7b1e2ebac0357b35e083a3e2e"} Mar 10 15:23:02 crc kubenswrapper[4743]: I0310 15:23:02.297542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerStarted","Data":"514418d60add69bc3c2a3555d78be20dfc8532c2aa9a9925c002732c9584410a"} Mar 10 15:23:03 crc kubenswrapper[4743]: I0310 15:23:03.312227 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7lm5r" event={"ID":"ec1cebe8-2ebe-4319-8df4-b6616411c83a","Type":"ContainerStarted","Data":"b3f6888ee4304c7d1e5f7db41bca8d4cc1ed4e8c7a34c3b37061efa3b482d1f5"} Mar 10 15:23:03 crc kubenswrapper[4743]: I0310 15:23:03.312872 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7lm5r" Mar 10 15:23:03 crc kubenswrapper[4743]: I0310 15:23:03.352065 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7lm5r" podStartSLOduration=6.024655843 podStartE2EDuration="13.352038607s" podCreationTimestamp="2026-03-10 15:22:50 +0000 UTC" firstStartedPulling="2026-03-10 15:22:51.233172693 +0000 UTC m=+1035.939987441" lastFinishedPulling="2026-03-10 15:22:58.560555457 +0000 UTC m=+1043.267370205" observedRunningTime="2026-03-10 15:23:03.347605 +0000 UTC m=+1048.054419758" watchObservedRunningTime="2026-03-10 15:23:03.352038607 +0000 UTC m=+1048.058853395" Mar 10 15:23:04 crc kubenswrapper[4743]: I0310 15:23:04.586779 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lbt9g"] Mar 10 15:23:04 crc kubenswrapper[4743]: I0310 15:23:04.587987 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lbt9g" Mar 10 15:23:04 crc kubenswrapper[4743]: W0310 15:23:04.590000 4743 reflector.go:561] object-"openstack-operators"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Mar 10 15:23:04 crc kubenswrapper[4743]: E0310 15:23:04.590074 4743 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 15:23:04 crc kubenswrapper[4743]: W0310 15:23:04.590385 4743 reflector.go:561] object-"openstack-operators"/"openstack-operator-index-dockercfg-ks2gt": failed to list *v1.Secret: secrets "openstack-operator-index-dockercfg-ks2gt" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Mar 10 15:23:04 crc kubenswrapper[4743]: E0310 15:23:04.590439 4743 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"openstack-operator-index-dockercfg-ks2gt\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-operator-index-dockercfg-ks2gt\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 15:23:04 crc kubenswrapper[4743]: I0310 15:23:04.591666 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack-operators"/"openshift-service-ca.crt" Mar 10 15:23:04 crc kubenswrapper[4743]: I0310 15:23:04.605718 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lbt9g"] Mar 10 15:23:04 crc kubenswrapper[4743]: I0310 15:23:04.784608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfzcf\" (UniqueName: \"kubernetes.io/projected/d74626c8-e38b-42b3-ab11-bb800292f1d0-kube-api-access-zfzcf\") pod \"openstack-operator-index-lbt9g\" (UID: \"d74626c8-e38b-42b3-ab11-bb800292f1d0\") " pod="openstack-operators/openstack-operator-index-lbt9g" Mar 10 15:23:04 crc kubenswrapper[4743]: I0310 15:23:04.885574 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfzcf\" (UniqueName: \"kubernetes.io/projected/d74626c8-e38b-42b3-ab11-bb800292f1d0-kube-api-access-zfzcf\") pod \"openstack-operator-index-lbt9g\" (UID: \"d74626c8-e38b-42b3-ab11-bb800292f1d0\") " pod="openstack-operators/openstack-operator-index-lbt9g" Mar 10 15:23:05 crc kubenswrapper[4743]: I0310 15:23:05.732458 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 10 15:23:05 crc kubenswrapper[4743]: I0310 15:23:05.744437 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfzcf\" (UniqueName: \"kubernetes.io/projected/d74626c8-e38b-42b3-ab11-bb800292f1d0-kube-api-access-zfzcf\") pod \"openstack-operator-index-lbt9g\" (UID: \"d74626c8-e38b-42b3-ab11-bb800292f1d0\") " pod="openstack-operators/openstack-operator-index-lbt9g" Mar 10 15:23:06 crc kubenswrapper[4743]: I0310 15:23:06.114348 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7lm5r" Mar 10 15:23:06 crc kubenswrapper[4743]: I0310 15:23:06.136981 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-index-dockercfg-ks2gt" Mar 10 15:23:06 crc kubenswrapper[4743]: I0310 15:23:06.142625 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lbt9g" Mar 10 15:23:06 crc kubenswrapper[4743]: I0310 15:23:06.165539 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7lm5r" Mar 10 15:23:06 crc kubenswrapper[4743]: I0310 15:23:06.704926 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lbt9g"] Mar 10 15:23:06 crc kubenswrapper[4743]: W0310 15:23:06.713041 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74626c8_e38b_42b3_ab11_bb800292f1d0.slice/crio-9f30ac4da32507d6d3429cbf0a140fcdd6486d708e9821c09ef4b9723e6571eb WatchSource:0}: Error finding container 9f30ac4da32507d6d3429cbf0a140fcdd6486d708e9821c09ef4b9723e6571eb: Status 404 returned error can't find the container with id 9f30ac4da32507d6d3429cbf0a140fcdd6486d708e9821c09ef4b9723e6571eb Mar 10 15:23:07 crc kubenswrapper[4743]: I0310 15:23:07.360798 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lbt9g" event={"ID":"d74626c8-e38b-42b3-ab11-bb800292f1d0","Type":"ContainerStarted","Data":"9f30ac4da32507d6d3429cbf0a140fcdd6486d708e9821c09ef4b9723e6571eb"} Mar 10 15:23:07 crc kubenswrapper[4743]: I0310 15:23:07.975306 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lbt9g"] Mar 10 15:23:08 crc kubenswrapper[4743]: I0310 15:23:08.569640 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-pn4jm"] Mar 10 15:23:08 crc kubenswrapper[4743]: I0310 15:23:08.573112 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:08 crc kubenswrapper[4743]: I0310 15:23:08.578098 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pn4jm"] Mar 10 15:23:08 crc kubenswrapper[4743]: I0310 15:23:08.653030 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9988x\" (UniqueName: \"kubernetes.io/projected/bdb5f935-4758-4d10-8e11-1e884efabce6-kube-api-access-9988x\") pod \"openstack-operator-index-pn4jm\" (UID: \"bdb5f935-4758-4d10-8e11-1e884efabce6\") " pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:08 crc kubenswrapper[4743]: I0310 15:23:08.755054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9988x\" (UniqueName: \"kubernetes.io/projected/bdb5f935-4758-4d10-8e11-1e884efabce6-kube-api-access-9988x\") pod \"openstack-operator-index-pn4jm\" (UID: \"bdb5f935-4758-4d10-8e11-1e884efabce6\") " pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:08 crc kubenswrapper[4743]: I0310 15:23:08.789473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9988x\" (UniqueName: \"kubernetes.io/projected/bdb5f935-4758-4d10-8e11-1e884efabce6-kube-api-access-9988x\") pod \"openstack-operator-index-pn4jm\" (UID: \"bdb5f935-4758-4d10-8e11-1e884efabce6\") " pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:08 crc kubenswrapper[4743]: I0310 15:23:08.904663 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:09 crc kubenswrapper[4743]: I0310 15:23:09.375415 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lbt9g" event={"ID":"d74626c8-e38b-42b3-ab11-bb800292f1d0","Type":"ContainerStarted","Data":"780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1"} Mar 10 15:23:09 crc kubenswrapper[4743]: I0310 15:23:09.375598 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-lbt9g" podUID="d74626c8-e38b-42b3-ab11-bb800292f1d0" containerName="registry-server" containerID="cri-o://780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1" gracePeriod=2 Mar 10 15:23:09 crc kubenswrapper[4743]: I0310 15:23:09.392419 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pn4jm"] Mar 10 15:23:09 crc kubenswrapper[4743]: I0310 15:23:09.400426 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lbt9g" podStartSLOduration=3.132216177 podStartE2EDuration="5.400396528s" podCreationTimestamp="2026-03-10 15:23:04 +0000 UTC" firstStartedPulling="2026-03-10 15:23:06.714702192 +0000 UTC m=+1051.421516940" lastFinishedPulling="2026-03-10 15:23:08.982882543 +0000 UTC m=+1053.689697291" observedRunningTime="2026-03-10 15:23:09.394853329 +0000 UTC m=+1054.101668077" watchObservedRunningTime="2026-03-10 15:23:09.400396528 +0000 UTC m=+1054.107211316" Mar 10 15:23:09 crc kubenswrapper[4743]: W0310 15:23:09.402158 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb5f935_4758_4d10_8e11_1e884efabce6.slice/crio-4ec0f89c30e002a1a348ca0e93e008d4985644d113c9b9ec3d0de6fd12a955a1 WatchSource:0}: Error finding container 
4ec0f89c30e002a1a348ca0e93e008d4985644d113c9b9ec3d0de6fd12a955a1: Status 404 returned error can't find the container with id 4ec0f89c30e002a1a348ca0e93e008d4985644d113c9b9ec3d0de6fd12a955a1 Mar 10 15:23:09 crc kubenswrapper[4743]: I0310 15:23:09.722036 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lbt9g" Mar 10 15:23:09 crc kubenswrapper[4743]: I0310 15:23:09.868602 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfzcf\" (UniqueName: \"kubernetes.io/projected/d74626c8-e38b-42b3-ab11-bb800292f1d0-kube-api-access-zfzcf\") pod \"d74626c8-e38b-42b3-ab11-bb800292f1d0\" (UID: \"d74626c8-e38b-42b3-ab11-bb800292f1d0\") " Mar 10 15:23:09 crc kubenswrapper[4743]: I0310 15:23:09.874178 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74626c8-e38b-42b3-ab11-bb800292f1d0-kube-api-access-zfzcf" (OuterVolumeSpecName: "kube-api-access-zfzcf") pod "d74626c8-e38b-42b3-ab11-bb800292f1d0" (UID: "d74626c8-e38b-42b3-ab11-bb800292f1d0"). InnerVolumeSpecName "kube-api-access-zfzcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:23:09 crc kubenswrapper[4743]: I0310 15:23:09.969955 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfzcf\" (UniqueName: \"kubernetes.io/projected/d74626c8-e38b-42b3-ab11-bb800292f1d0-kube-api-access-zfzcf\") on node \"crc\" DevicePath \"\"" Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.383596 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pn4jm" event={"ID":"bdb5f935-4758-4d10-8e11-1e884efabce6","Type":"ContainerStarted","Data":"e3123160bab27ab81ea9967d52a96a21ef4649c405f429c32779945374dea9b9"} Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.383662 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pn4jm" event={"ID":"bdb5f935-4758-4d10-8e11-1e884efabce6","Type":"ContainerStarted","Data":"4ec0f89c30e002a1a348ca0e93e008d4985644d113c9b9ec3d0de6fd12a955a1"} Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.385192 4743 generic.go:334] "Generic (PLEG): container finished" podID="d74626c8-e38b-42b3-ab11-bb800292f1d0" containerID="780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1" exitCode=0 Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.385250 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lbt9g" Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.385374 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lbt9g" event={"ID":"d74626c8-e38b-42b3-ab11-bb800292f1d0","Type":"ContainerDied","Data":"780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1"} Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.385495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lbt9g" event={"ID":"d74626c8-e38b-42b3-ab11-bb800292f1d0","Type":"ContainerDied","Data":"9f30ac4da32507d6d3429cbf0a140fcdd6486d708e9821c09ef4b9723e6571eb"} Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.385599 4743 scope.go:117] "RemoveContainer" containerID="780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1" Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.409755 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-pn4jm" podStartSLOduration=2.358604817 podStartE2EDuration="2.409680001s" podCreationTimestamp="2026-03-10 15:23:08 +0000 UTC" firstStartedPulling="2026-03-10 15:23:09.408515591 +0000 UTC m=+1054.115330339" lastFinishedPulling="2026-03-10 15:23:09.459590775 +0000 UTC m=+1054.166405523" observedRunningTime="2026-03-10 15:23:10.402385732 +0000 UTC m=+1055.109200510" watchObservedRunningTime="2026-03-10 15:23:10.409680001 +0000 UTC m=+1055.116494829" Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.417065 4743 scope.go:117] "RemoveContainer" containerID="780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1" Mar 10 15:23:10 crc kubenswrapper[4743]: E0310 15:23:10.418276 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1\": container with 
ID starting with 780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1 not found: ID does not exist" containerID="780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1" Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.418334 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1"} err="failed to get container status \"780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1\": rpc error: code = NotFound desc = could not find container \"780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1\": container with ID starting with 780796841dd186a2f25177a959bb4aa6232bffffccbac6a46f5d0d0c7d13e2d1 not found: ID does not exist" Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.428780 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lbt9g"] Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.433725 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-lbt9g"] Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.507271 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-c6tlb" Mar 10 15:23:10 crc kubenswrapper[4743]: I0310 15:23:10.613200 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-clgfh" Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.117082 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7lm5r" Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.253269 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.253337 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.253378 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.254025 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f3c2bb9de0715122300a23c69b01248c82e5f8f70b4cad7c24ef747b724b4d8"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.254091 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://9f3c2bb9de0715122300a23c69b01248c82e5f8f70b4cad7c24ef747b724b4d8" gracePeriod=600 Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.397036 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="9f3c2bb9de0715122300a23c69b01248c82e5f8f70b4cad7c24ef747b724b4d8" exitCode=0 Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.397273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"9f3c2bb9de0715122300a23c69b01248c82e5f8f70b4cad7c24ef747b724b4d8"} Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.397477 4743 scope.go:117] "RemoveContainer" containerID="97d590bdb55b8d3eeaf67e203c3704815de1593197109afcafe553653c1d6c9f" Mar 10 15:23:11 crc kubenswrapper[4743]: I0310 15:23:11.924699 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74626c8-e38b-42b3-ab11-bb800292f1d0" path="/var/lib/kubelet/pods/d74626c8-e38b-42b3-ab11-bb800292f1d0/volumes" Mar 10 15:23:12 crc kubenswrapper[4743]: I0310 15:23:12.408224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"e87aabce4b79954ca871a938cf484c77623556f05115d13359a5bd9f0c4154c7"} Mar 10 15:23:18 crc kubenswrapper[4743]: I0310 15:23:18.906178 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:18 crc kubenswrapper[4743]: I0310 15:23:18.907256 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:18 crc kubenswrapper[4743]: I0310 15:23:18.952160 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:19 crc kubenswrapper[4743]: I0310 15:23:19.496480 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-pn4jm" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.032780 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz"] Mar 10 15:23:25 crc kubenswrapper[4743]: E0310 15:23:25.033897 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d74626c8-e38b-42b3-ab11-bb800292f1d0" containerName="registry-server" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.033918 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74626c8-e38b-42b3-ab11-bb800292f1d0" containerName="registry-server" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.034097 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74626c8-e38b-42b3-ab11-bb800292f1d0" containerName="registry-server" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.035423 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.039302 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-flg4z" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.049694 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz"] Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.195554 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph2cr\" (UniqueName: \"kubernetes.io/projected/412e2285-16ae-49ca-a514-ee3f1297bfd4-kube-api-access-ph2cr\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.195689 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-bundle\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: 
\"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.195729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-util\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.296919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-bundle\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.297018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-util\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.297082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph2cr\" (UniqueName: \"kubernetes.io/projected/412e2285-16ae-49ca-a514-ee3f1297bfd4-kube-api-access-ph2cr\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 
15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.297908 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-bundle\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.297947 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-util\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.318838 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph2cr\" (UniqueName: \"kubernetes.io/projected/412e2285-16ae-49ca-a514-ee3f1297bfd4-kube-api-access-ph2cr\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.380097 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:25 crc kubenswrapper[4743]: I0310 15:23:25.799441 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz"] Mar 10 15:23:26 crc kubenswrapper[4743]: I0310 15:23:26.514070 4743 generic.go:334] "Generic (PLEG): container finished" podID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerID="7dc3f6cdcfbe6251fa3034ded1aaa6dea32d2a82b1a4537724530f4ef05ba2ee" exitCode=0 Mar 10 15:23:26 crc kubenswrapper[4743]: I0310 15:23:26.514143 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" event={"ID":"412e2285-16ae-49ca-a514-ee3f1297bfd4","Type":"ContainerDied","Data":"7dc3f6cdcfbe6251fa3034ded1aaa6dea32d2a82b1a4537724530f4ef05ba2ee"} Mar 10 15:23:26 crc kubenswrapper[4743]: I0310 15:23:26.514202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" event={"ID":"412e2285-16ae-49ca-a514-ee3f1297bfd4","Type":"ContainerStarted","Data":"8b2c300e75bd8debb0b9ee38c4901027601663eae45c1487cf43020ae63d75dd"} Mar 10 15:23:27 crc kubenswrapper[4743]: I0310 15:23:27.525738 4743 generic.go:334] "Generic (PLEG): container finished" podID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerID="241173dc2452f43c67d79491d106d275384313bd63eb93f2065b776ae85f1814" exitCode=0 Mar 10 15:23:27 crc kubenswrapper[4743]: I0310 15:23:27.526224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" event={"ID":"412e2285-16ae-49ca-a514-ee3f1297bfd4","Type":"ContainerDied","Data":"241173dc2452f43c67d79491d106d275384313bd63eb93f2065b776ae85f1814"} Mar 10 15:23:28 crc kubenswrapper[4743]: I0310 15:23:28.535127 4743 generic.go:334] 
"Generic (PLEG): container finished" podID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerID="3b2c43158778a5408a094e68e6814b5218613318824d6f90e096b3d88da95bf6" exitCode=0 Mar 10 15:23:28 crc kubenswrapper[4743]: I0310 15:23:28.535184 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" event={"ID":"412e2285-16ae-49ca-a514-ee3f1297bfd4","Type":"ContainerDied","Data":"3b2c43158778a5408a094e68e6814b5218613318824d6f90e096b3d88da95bf6"} Mar 10 15:23:29 crc kubenswrapper[4743]: I0310 15:23:29.822555 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:29 crc kubenswrapper[4743]: I0310 15:23:29.969079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-util\") pod \"412e2285-16ae-49ca-a514-ee3f1297bfd4\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " Mar 10 15:23:29 crc kubenswrapper[4743]: I0310 15:23:29.969247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-bundle\") pod \"412e2285-16ae-49ca-a514-ee3f1297bfd4\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " Mar 10 15:23:29 crc kubenswrapper[4743]: I0310 15:23:29.969292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph2cr\" (UniqueName: \"kubernetes.io/projected/412e2285-16ae-49ca-a514-ee3f1297bfd4-kube-api-access-ph2cr\") pod \"412e2285-16ae-49ca-a514-ee3f1297bfd4\" (UID: \"412e2285-16ae-49ca-a514-ee3f1297bfd4\") " Mar 10 15:23:29 crc kubenswrapper[4743]: I0310 15:23:29.970522 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-bundle" (OuterVolumeSpecName: "bundle") pod "412e2285-16ae-49ca-a514-ee3f1297bfd4" (UID: "412e2285-16ae-49ca-a514-ee3f1297bfd4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:23:29 crc kubenswrapper[4743]: I0310 15:23:29.977487 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412e2285-16ae-49ca-a514-ee3f1297bfd4-kube-api-access-ph2cr" (OuterVolumeSpecName: "kube-api-access-ph2cr") pod "412e2285-16ae-49ca-a514-ee3f1297bfd4" (UID: "412e2285-16ae-49ca-a514-ee3f1297bfd4"). InnerVolumeSpecName "kube-api-access-ph2cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:23:29 crc kubenswrapper[4743]: I0310 15:23:29.983653 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-util" (OuterVolumeSpecName: "util") pod "412e2285-16ae-49ca-a514-ee3f1297bfd4" (UID: "412e2285-16ae-49ca-a514-ee3f1297bfd4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:23:30 crc kubenswrapper[4743]: I0310 15:23:30.070742 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:23:30 crc kubenswrapper[4743]: I0310 15:23:30.070789 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph2cr\" (UniqueName: \"kubernetes.io/projected/412e2285-16ae-49ca-a514-ee3f1297bfd4-kube-api-access-ph2cr\") on node \"crc\" DevicePath \"\"" Mar 10 15:23:30 crc kubenswrapper[4743]: I0310 15:23:30.070802 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/412e2285-16ae-49ca-a514-ee3f1297bfd4-util\") on node \"crc\" DevicePath \"\"" Mar 10 15:23:30 crc kubenswrapper[4743]: I0310 15:23:30.551997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" event={"ID":"412e2285-16ae-49ca-a514-ee3f1297bfd4","Type":"ContainerDied","Data":"8b2c300e75bd8debb0b9ee38c4901027601663eae45c1487cf43020ae63d75dd"} Mar 10 15:23:30 crc kubenswrapper[4743]: I0310 15:23:30.552078 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b2c300e75bd8debb0b9ee38c4901027601663eae45c1487cf43020ae63d75dd" Mar 10 15:23:30 crc kubenswrapper[4743]: I0310 15:23:30.552206 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.078477 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt"] Mar 10 15:23:37 crc kubenswrapper[4743]: E0310 15:23:37.079446 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerName="extract" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.079461 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerName="extract" Mar 10 15:23:37 crc kubenswrapper[4743]: E0310 15:23:37.079479 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerName="util" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.079490 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerName="util" Mar 10 15:23:37 crc kubenswrapper[4743]: E0310 15:23:37.079520 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerName="pull" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.079531 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerName="pull" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.079678 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="412e2285-16ae-49ca-a514-ee3f1297bfd4" containerName="extract" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.080218 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.096863 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-79b42" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.133357 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt"] Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.182092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbbt\" (UniqueName: \"kubernetes.io/projected/4cd2825d-785d-46c7-8d95-0237100bd129-kube-api-access-6vbbt\") pod \"openstack-operator-controller-init-7c7f7d994-spvkt\" (UID: \"4cd2825d-785d-46c7-8d95-0237100bd129\") " pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.284032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbbt\" (UniqueName: \"kubernetes.io/projected/4cd2825d-785d-46c7-8d95-0237100bd129-kube-api-access-6vbbt\") pod \"openstack-operator-controller-init-7c7f7d994-spvkt\" (UID: \"4cd2825d-785d-46c7-8d95-0237100bd129\") " pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.314756 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbbt\" (UniqueName: \"kubernetes.io/projected/4cd2825d-785d-46c7-8d95-0237100bd129-kube-api-access-6vbbt\") pod \"openstack-operator-controller-init-7c7f7d994-spvkt\" (UID: \"4cd2825d-785d-46c7-8d95-0237100bd129\") " pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.425569 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" Mar 10 15:23:37 crc kubenswrapper[4743]: I0310 15:23:37.845217 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt"] Mar 10 15:23:37 crc kubenswrapper[4743]: W0310 15:23:37.853967 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cd2825d_785d_46c7_8d95_0237100bd129.slice/crio-f089626aa263e3965bd6e90b6b28477f357c5a895419dad538e348ebbf7c0578 WatchSource:0}: Error finding container f089626aa263e3965bd6e90b6b28477f357c5a895419dad538e348ebbf7c0578: Status 404 returned error can't find the container with id f089626aa263e3965bd6e90b6b28477f357c5a895419dad538e348ebbf7c0578 Mar 10 15:23:38 crc kubenswrapper[4743]: I0310 15:23:38.616272 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" event={"ID":"4cd2825d-785d-46c7-8d95-0237100bd129","Type":"ContainerStarted","Data":"f089626aa263e3965bd6e90b6b28477f357c5a895419dad538e348ebbf7c0578"} Mar 10 15:23:42 crc kubenswrapper[4743]: I0310 15:23:42.652052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" event={"ID":"4cd2825d-785d-46c7-8d95-0237100bd129","Type":"ContainerStarted","Data":"307ea82e63766c40b01574b16c0e60dd15a85c10ae332cc8dae425b0d0b027a4"} Mar 10 15:23:42 crc kubenswrapper[4743]: I0310 15:23:42.652717 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" Mar 10 15:23:42 crc kubenswrapper[4743]: I0310 15:23:42.696610 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" podStartSLOduration=1.982883695 
podStartE2EDuration="5.69659027s" podCreationTimestamp="2026-03-10 15:23:37 +0000 UTC" firstStartedPulling="2026-03-10 15:23:37.859547454 +0000 UTC m=+1082.566362202" lastFinishedPulling="2026-03-10 15:23:41.573254029 +0000 UTC m=+1086.280068777" observedRunningTime="2026-03-10 15:23:42.69170061 +0000 UTC m=+1087.398515448" watchObservedRunningTime="2026-03-10 15:23:42.69659027 +0000 UTC m=+1087.403405028" Mar 10 15:23:47 crc kubenswrapper[4743]: I0310 15:23:47.430078 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-spvkt" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.142019 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552604-lrmxz"] Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.144251 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552604-lrmxz" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.147431 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.147856 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.147856 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.149401 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552604-lrmxz"] Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.185006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v866t\" (UniqueName: \"kubernetes.io/projected/e1a6bc3c-7fe6-4c98-b45a-01296d21caf4-kube-api-access-v866t\") pod 
\"auto-csr-approver-29552604-lrmxz\" (UID: \"e1a6bc3c-7fe6-4c98-b45a-01296d21caf4\") " pod="openshift-infra/auto-csr-approver-29552604-lrmxz" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.287312 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v866t\" (UniqueName: \"kubernetes.io/projected/e1a6bc3c-7fe6-4c98-b45a-01296d21caf4-kube-api-access-v866t\") pod \"auto-csr-approver-29552604-lrmxz\" (UID: \"e1a6bc3c-7fe6-4c98-b45a-01296d21caf4\") " pod="openshift-infra/auto-csr-approver-29552604-lrmxz" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.317062 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v866t\" (UniqueName: \"kubernetes.io/projected/e1a6bc3c-7fe6-4c98-b45a-01296d21caf4-kube-api-access-v866t\") pod \"auto-csr-approver-29552604-lrmxz\" (UID: \"e1a6bc3c-7fe6-4c98-b45a-01296d21caf4\") " pod="openshift-infra/auto-csr-approver-29552604-lrmxz" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.496880 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552604-lrmxz" Mar 10 15:24:00 crc kubenswrapper[4743]: I0310 15:24:00.892741 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552604-lrmxz"] Mar 10 15:24:01 crc kubenswrapper[4743]: I0310 15:24:01.782208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552604-lrmxz" event={"ID":"e1a6bc3c-7fe6-4c98-b45a-01296d21caf4","Type":"ContainerStarted","Data":"810345544f325c344d8084c207b45743ffbfe42576fce1add98ad0e990c01de9"} Mar 10 15:24:03 crc kubenswrapper[4743]: I0310 15:24:03.796725 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1a6bc3c-7fe6-4c98-b45a-01296d21caf4" containerID="bc6875416e2bf56649ef481dec269bb9679823c362f822b0f47a0d00adee58a8" exitCode=0 Mar 10 15:24:03 crc kubenswrapper[4743]: I0310 15:24:03.796797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552604-lrmxz" event={"ID":"e1a6bc3c-7fe6-4c98-b45a-01296d21caf4","Type":"ContainerDied","Data":"bc6875416e2bf56649ef481dec269bb9679823c362f822b0f47a0d00adee58a8"} Mar 10 15:24:05 crc kubenswrapper[4743]: I0310 15:24:05.088660 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552604-lrmxz" Mar 10 15:24:05 crc kubenswrapper[4743]: I0310 15:24:05.276758 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v866t\" (UniqueName: \"kubernetes.io/projected/e1a6bc3c-7fe6-4c98-b45a-01296d21caf4-kube-api-access-v866t\") pod \"e1a6bc3c-7fe6-4c98-b45a-01296d21caf4\" (UID: \"e1a6bc3c-7fe6-4c98-b45a-01296d21caf4\") " Mar 10 15:24:05 crc kubenswrapper[4743]: I0310 15:24:05.285095 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a6bc3c-7fe6-4c98-b45a-01296d21caf4-kube-api-access-v866t" (OuterVolumeSpecName: "kube-api-access-v866t") pod "e1a6bc3c-7fe6-4c98-b45a-01296d21caf4" (UID: "e1a6bc3c-7fe6-4c98-b45a-01296d21caf4"). InnerVolumeSpecName "kube-api-access-v866t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:24:05 crc kubenswrapper[4743]: I0310 15:24:05.378125 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v866t\" (UniqueName: \"kubernetes.io/projected/e1a6bc3c-7fe6-4c98-b45a-01296d21caf4-kube-api-access-v866t\") on node \"crc\" DevicePath \"\"" Mar 10 15:24:05 crc kubenswrapper[4743]: I0310 15:24:05.810840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552604-lrmxz" event={"ID":"e1a6bc3c-7fe6-4c98-b45a-01296d21caf4","Type":"ContainerDied","Data":"810345544f325c344d8084c207b45743ffbfe42576fce1add98ad0e990c01de9"} Mar 10 15:24:05 crc kubenswrapper[4743]: I0310 15:24:05.811277 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="810345544f325c344d8084c207b45743ffbfe42576fce1add98ad0e990c01de9" Mar 10 15:24:05 crc kubenswrapper[4743]: I0310 15:24:05.810990 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552604-lrmxz" Mar 10 15:24:06 crc kubenswrapper[4743]: I0310 15:24:06.173429 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-65srp"] Mar 10 15:24:06 crc kubenswrapper[4743]: I0310 15:24:06.177377 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-65srp"] Mar 10 15:24:07 crc kubenswrapper[4743]: I0310 15:24:07.923021 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4" path="/var/lib/kubelet/pods/ca2145ae-aba7-4b37-85ad-64d1c3c2a8c4/volumes" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.732780 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz"] Mar 10 15:24:08 crc kubenswrapper[4743]: E0310 15:24:08.733095 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a6bc3c-7fe6-4c98-b45a-01296d21caf4" containerName="oc" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.733108 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a6bc3c-7fe6-4c98-b45a-01296d21caf4" containerName="oc" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.733226 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a6bc3c-7fe6-4c98-b45a-01296d21caf4" containerName="oc" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.733658 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.735797 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5h45f" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.738171 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.739079 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.743511 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-r4g8r" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.745933 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.756738 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.757551 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.758836 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-z62mx" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.768406 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.780264 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.798273 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.820490 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.829204 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hzc7b" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.940979 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.942232 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.943054 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6d2w\" (UniqueName: \"kubernetes.io/projected/5525e521-3469-4b99-8fc0-0894d01bfbb1-kube-api-access-p6d2w\") pod \"designate-operator-controller-manager-66d56f6ff4-gsvwz\" (UID: \"5525e521-3469-4b99-8fc0-0894d01bfbb1\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.943097 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmd82\" (UniqueName: \"kubernetes.io/projected/33b59c71-8ce1-468d-b617-50bc021d41b2-kube-api-access-zmd82\") pod \"cinder-operator-controller-manager-984cd4dcf-97gbz\" (UID: \"33b59c71-8ce1-468d-b617-50bc021d41b2\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.943190 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xskzj\" (UniqueName: \"kubernetes.io/projected/b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8-kube-api-access-xskzj\") pod \"barbican-operator-controller-manager-677bd678f7-8ppkz\" (UID: \"b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.943226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7m2j\" (UniqueName: \"kubernetes.io/projected/4b4416c1-939f-4740-bd01-f45d9cfd8822-kube-api-access-t7m2j\") pod \"glance-operator-controller-manager-5964f64c48-bdbfx\" (UID: \"4b4416c1-939f-4740-bd01-f45d9cfd8822\") " 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.951641 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.970615 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jmnvn" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.987250 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m"] Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.988617 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" Mar 10 15:24:08 crc kubenswrapper[4743]: I0310 15:24:08.997235 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4dk72" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.018960 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.020132 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.024713 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-884rg" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.025897 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.027274 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.039704 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.040022 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.042797 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9rw78" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.044442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xskzj\" (UniqueName: \"kubernetes.io/projected/b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8-kube-api-access-xskzj\") pod \"barbican-operator-controller-manager-677bd678f7-8ppkz\" (UID: \"b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.044510 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7m2j\" (UniqueName: 
\"kubernetes.io/projected/4b4416c1-939f-4740-bd01-f45d9cfd8822-kube-api-access-t7m2j\") pod \"glance-operator-controller-manager-5964f64c48-bdbfx\" (UID: \"4b4416c1-939f-4740-bd01-f45d9cfd8822\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.044556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6d2w\" (UniqueName: \"kubernetes.io/projected/5525e521-3469-4b99-8fc0-0894d01bfbb1-kube-api-access-p6d2w\") pod \"designate-operator-controller-manager-66d56f6ff4-gsvwz\" (UID: \"5525e521-3469-4b99-8fc0-0894d01bfbb1\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.044577 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmd82\" (UniqueName: \"kubernetes.io/projected/33b59c71-8ce1-468d-b617-50bc021d41b2-kube-api-access-zmd82\") pod \"cinder-operator-controller-manager-984cd4dcf-97gbz\" (UID: \"33b59c71-8ce1-468d-b617-50bc021d41b2\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.044629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7n9\" (UniqueName: \"kubernetes.io/projected/8485c807-58f7-45c6-845f-1bc881558553-kube-api-access-pq7n9\") pod \"heat-operator-controller-manager-77b6666d85-lcr6h\" (UID: \"8485c807-58f7-45c6-845f-1bc881558553\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.045389 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.064720 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.078014 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.078185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmd82\" (UniqueName: \"kubernetes.io/projected/33b59c71-8ce1-468d-b617-50bc021d41b2-kube-api-access-zmd82\") pod \"cinder-operator-controller-manager-984cd4dcf-97gbz\" (UID: \"33b59c71-8ce1-468d-b617-50bc021d41b2\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.084525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xskzj\" (UniqueName: \"kubernetes.io/projected/b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8-kube-api-access-xskzj\") pod \"barbican-operator-controller-manager-677bd678f7-8ppkz\" (UID: \"b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.091963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6d2w\" (UniqueName: \"kubernetes.io/projected/5525e521-3469-4b99-8fc0-0894d01bfbb1-kube-api-access-p6d2w\") pod \"designate-operator-controller-manager-66d56f6ff4-gsvwz\" (UID: \"5525e521-3469-4b99-8fc0-0894d01bfbb1\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.103977 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7m2j\" (UniqueName: \"kubernetes.io/projected/4b4416c1-939f-4740-bd01-f45d9cfd8822-kube-api-access-t7m2j\") pod \"glance-operator-controller-manager-5964f64c48-bdbfx\" (UID: 
\"4b4416c1-939f-4740-bd01-f45d9cfd8822\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.106398 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.111329 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.116406 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dkm5n" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.146497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7n9\" (UniqueName: \"kubernetes.io/projected/8485c807-58f7-45c6-845f-1bc881558553-kube-api-access-pq7n9\") pod \"heat-operator-controller-manager-77b6666d85-lcr6h\" (UID: \"8485c807-58f7-45c6-845f-1bc881558553\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.146553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jl9z\" (UniqueName: \"kubernetes.io/projected/bf42ffd4-e446-4dee-b343-b8d64dcb8e2c-kube-api-access-9jl9z\") pod \"horizon-operator-controller-manager-6d9d6b584d-tsd2m\" (UID: \"bf42ffd4-e446-4dee-b343-b8d64dcb8e2c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.146664 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: 
\"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.146693 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6ptm\" (UniqueName: \"kubernetes.io/projected/45ac07a3-b1f8-4232-937d-6a7275aac026-kube-api-access-m6ptm\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.146866 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dmwf\" (UniqueName: \"kubernetes.io/projected/364fba36-380b-47fb-9f0d-38585fb94bac-kube-api-access-4dmwf\") pod \"ironic-operator-controller-manager-6bbb499bbc-4vj77\" (UID: \"364fba36-380b-47fb-9f0d-38585fb94bac\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.156429 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.158066 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.161089 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2ghz8" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.163859 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7n9\" (UniqueName: \"kubernetes.io/projected/8485c807-58f7-45c6-845f-1bc881558553-kube-api-access-pq7n9\") pod \"heat-operator-controller-manager-77b6666d85-lcr6h\" (UID: \"8485c807-58f7-45c6-845f-1bc881558553\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.168870 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.169926 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.172659 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-5nj6v" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.176153 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.181319 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.191700 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.205959 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.228676 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.229724 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.233589 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sv6xn" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.237446 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.243156 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.248452 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dtl6f" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.248675 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49gkt\" (UniqueName: \"kubernetes.io/projected/a23de576-a42f-4a68-8e30-5a71791b89e0-kube-api-access-49gkt\") pod \"keystone-operator-controller-manager-684f77d66d-r4gcw\" (UID: \"a23de576-a42f-4a68-8e30-5a71791b89e0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.248746 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.248766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6ptm\" (UniqueName: \"kubernetes.io/projected/45ac07a3-b1f8-4232-937d-6a7275aac026-kube-api-access-m6ptm\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.248831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dmwf\" (UniqueName: \"kubernetes.io/projected/364fba36-380b-47fb-9f0d-38585fb94bac-kube-api-access-4dmwf\") pod \"ironic-operator-controller-manager-6bbb499bbc-4vj77\" (UID: 
\"364fba36-380b-47fb-9f0d-38585fb94bac\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.248870 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wj79\" (UniqueName: \"kubernetes.io/projected/13525a0b-792c-4304-9b21-4ff84f391e20-kube-api-access-5wj79\") pod \"manila-operator-controller-manager-68f45f9d9f-vcztk\" (UID: \"13525a0b-792c-4304-9b21-4ff84f391e20\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.248897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jl9z\" (UniqueName: \"kubernetes.io/projected/bf42ffd4-e446-4dee-b343-b8d64dcb8e2c-kube-api-access-9jl9z\") pod \"horizon-operator-controller-manager-6d9d6b584d-tsd2m\" (UID: \"bf42ffd4-e446-4dee-b343-b8d64dcb8e2c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.249538 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.249602 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert podName:45ac07a3-b1f8-4232-937d-6a7275aac026 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:09.749581109 +0000 UTC m=+1114.456395857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert") pod "infra-operator-controller-manager-5995f4446f-cvxq6" (UID: "45ac07a3-b1f8-4232-937d-6a7275aac026") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.267172 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dmwf\" (UniqueName: \"kubernetes.io/projected/364fba36-380b-47fb-9f0d-38585fb94bac-kube-api-access-4dmwf\") pod \"ironic-operator-controller-manager-6bbb499bbc-4vj77\" (UID: \"364fba36-380b-47fb-9f0d-38585fb94bac\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.268396 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jl9z\" (UniqueName: \"kubernetes.io/projected/bf42ffd4-e446-4dee-b343-b8d64dcb8e2c-kube-api-access-9jl9z\") pod \"horizon-operator-controller-manager-6d9d6b584d-tsd2m\" (UID: \"bf42ffd4-e446-4dee-b343-b8d64dcb8e2c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.271743 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6ptm\" (UniqueName: \"kubernetes.io/projected/45ac07a3-b1f8-4232-937d-6a7275aac026-kube-api-access-m6ptm\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.276268 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.276529 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.277528 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.281173 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-l5rns" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.301738 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.324236 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.324290 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.327307 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.335907 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.336612 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.337217 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.337736 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.339375 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.339520 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tdcq8" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.341680 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2xb4r" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.342246 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.343125 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.349223 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.350400 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.350529 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.351280 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sngqw\" (UniqueName: \"kubernetes.io/projected/7b3fecc3-bc35-40a2-b927-65cda4fbe04d-kube-api-access-sngqw\") pod \"nova-operator-controller-manager-569cc54c5-vqpbs\" (UID: \"7b3fecc3-bc35-40a2-b927-65cda4fbe04d\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.351319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wd2z\" (UniqueName: \"kubernetes.io/projected/38bf9e64-a759-40d7-a3f9-f443299160ec-kube-api-access-7wd2z\") pod \"neutron-operator-controller-manager-776c5696bf-v5fw5\" (UID: \"38bf9e64-a759-40d7-a3f9-f443299160ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.351364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wj79\" (UniqueName: \"kubernetes.io/projected/13525a0b-792c-4304-9b21-4ff84f391e20-kube-api-access-5wj79\") pod \"manila-operator-controller-manager-68f45f9d9f-vcztk\" (UID: \"13525a0b-792c-4304-9b21-4ff84f391e20\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.351391 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzlhx\" (UniqueName: \"kubernetes.io/projected/e17c0b05-44be-478d-8674-fe42a01f9397-kube-api-access-vzlhx\") pod \"mariadb-operator-controller-manager-658d4cdd5-s8gf8\" (UID: \"e17c0b05-44be-478d-8674-fe42a01f9397\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" Mar 
10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.351441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49gkt\" (UniqueName: \"kubernetes.io/projected/a23de576-a42f-4a68-8e30-5a71791b89e0-kube-api-access-49gkt\") pod \"keystone-operator-controller-manager-684f77d66d-r4gcw\" (UID: \"a23de576-a42f-4a68-8e30-5a71791b89e0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.353196 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.354892 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-g69js" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.355215 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bm5bs" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.365513 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.365955 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.366639 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.375394 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.375571 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.383582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49gkt\" (UniqueName: \"kubernetes.io/projected/a23de576-a42f-4a68-8e30-5a71791b89e0-kube-api-access-49gkt\") pod \"keystone-operator-controller-manager-684f77d66d-r4gcw\" (UID: \"a23de576-a42f-4a68-8e30-5a71791b89e0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.452513 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.454644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w5vx\" (UniqueName: \"kubernetes.io/projected/fa1fe66b-2fd5-40fe-8062-9fd96edd519a-kube-api-access-5w5vx\") pod \"ovn-operator-controller-manager-bbc5b68f9-wbbgz\" (UID: \"fa1fe66b-2fd5-40fe-8062-9fd96edd519a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.454717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzlhx\" (UniqueName: \"kubernetes.io/projected/e17c0b05-44be-478d-8674-fe42a01f9397-kube-api-access-vzlhx\") pod \"mariadb-operator-controller-manager-658d4cdd5-s8gf8\" (UID: \"e17c0b05-44be-478d-8674-fe42a01f9397\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.454771 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tm88\" (UniqueName: \"kubernetes.io/projected/a70223a5-4e89-4b37-a031-6b93350120c2-kube-api-access-2tm88\") pod \"octavia-operator-controller-manager-5f4f55cb5c-kn6vc\" (UID: \"a70223a5-4e89-4b37-a031-6b93350120c2\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.454800 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqsvh\" (UniqueName: \"kubernetes.io/projected/a9eaff72-01ad-4abf-a515-92bdda950b0f-kube-api-access-hqsvh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.454845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.454875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4sh\" (UniqueName: \"kubernetes.io/projected/e6d20795-5386-4ef1-929b-21659af4ecc6-kube-api-access-hv4sh\") pod \"placement-operator-controller-manager-574d45c66c-s7fnn\" (UID: \"e6d20795-5386-4ef1-929b-21659af4ecc6\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.454928 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sngqw\" (UniqueName: \"kubernetes.io/projected/7b3fecc3-bc35-40a2-b927-65cda4fbe04d-kube-api-access-sngqw\") pod \"nova-operator-controller-manager-569cc54c5-vqpbs\" (UID: \"7b3fecc3-bc35-40a2-b927-65cda4fbe04d\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.454964 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wd2z\" (UniqueName: \"kubernetes.io/projected/38bf9e64-a759-40d7-a3f9-f443299160ec-kube-api-access-7wd2z\") pod \"neutron-operator-controller-manager-776c5696bf-v5fw5\" (UID: \"38bf9e64-a759-40d7-a3f9-f443299160ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.455003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhfwk\" (UniqueName: \"kubernetes.io/projected/7a579dd7-32af-4705-913a-12ec54a71572-kube-api-access-rhfwk\") pod \"swift-operator-controller-manager-677c674df7-jfbcg\" (UID: \"7a579dd7-32af-4705-913a-12ec54a71572\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.462078 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.464983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wj79\" (UniqueName: \"kubernetes.io/projected/13525a0b-792c-4304-9b21-4ff84f391e20-kube-api-access-5wj79\") pod \"manila-operator-controller-manager-68f45f9d9f-vcztk\" (UID: \"13525a0b-792c-4304-9b21-4ff84f391e20\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.503759 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sngqw\" (UniqueName: \"kubernetes.io/projected/7b3fecc3-bc35-40a2-b927-65cda4fbe04d-kube-api-access-sngqw\") pod \"nova-operator-controller-manager-569cc54c5-vqpbs\" (UID: \"7b3fecc3-bc35-40a2-b927-65cda4fbe04d\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.504455 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wd2z\" (UniqueName: \"kubernetes.io/projected/38bf9e64-a759-40d7-a3f9-f443299160ec-kube-api-access-7wd2z\") pod \"neutron-operator-controller-manager-776c5696bf-v5fw5\" (UID: \"38bf9e64-a759-40d7-a3f9-f443299160ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.505849 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.506004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzlhx\" (UniqueName: \"kubernetes.io/projected/e17c0b05-44be-478d-8674-fe42a01f9397-kube-api-access-vzlhx\") pod \"mariadb-operator-controller-manager-658d4cdd5-s8gf8\" (UID: \"e17c0b05-44be-478d-8674-fe42a01f9397\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.507179 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.509061 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.510712 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4wzhp" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.565352 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tm88\" (UniqueName: \"kubernetes.io/projected/a70223a5-4e89-4b37-a031-6b93350120c2-kube-api-access-2tm88\") pod \"octavia-operator-controller-manager-5f4f55cb5c-kn6vc\" (UID: \"a70223a5-4e89-4b37-a031-6b93350120c2\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.565441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqsvh\" (UniqueName: \"kubernetes.io/projected/a9eaff72-01ad-4abf-a515-92bdda950b0f-kube-api-access-hqsvh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.568461 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.568563 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert podName:a9eaff72-01ad-4abf-a515-92bdda950b0f nodeName:}" failed. No retries permitted until 2026-03-10 15:24:10.068539163 +0000 UTC m=+1114.775353911 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" (UID: "a9eaff72-01ad-4abf-a515-92bdda950b0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.568911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.569000 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4sh\" (UniqueName: \"kubernetes.io/projected/e6d20795-5386-4ef1-929b-21659af4ecc6-kube-api-access-hv4sh\") pod \"placement-operator-controller-manager-574d45c66c-s7fnn\" (UID: \"e6d20795-5386-4ef1-929b-21659af4ecc6\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.569135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhfwk\" (UniqueName: \"kubernetes.io/projected/7a579dd7-32af-4705-913a-12ec54a71572-kube-api-access-rhfwk\") pod \"swift-operator-controller-manager-677c674df7-jfbcg\" (UID: \"7a579dd7-32af-4705-913a-12ec54a71572\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.569186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w5vx\" (UniqueName: \"kubernetes.io/projected/fa1fe66b-2fd5-40fe-8062-9fd96edd519a-kube-api-access-5w5vx\") pod 
\"ovn-operator-controller-manager-bbc5b68f9-wbbgz\" (UID: \"fa1fe66b-2fd5-40fe-8062-9fd96edd519a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.569887 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.576052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.600239 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w5vx\" (UniqueName: \"kubernetes.io/projected/fa1fe66b-2fd5-40fe-8062-9fd96edd519a-kube-api-access-5w5vx\") pod \"ovn-operator-controller-manager-bbc5b68f9-wbbgz\" (UID: \"fa1fe66b-2fd5-40fe-8062-9fd96edd519a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.600868 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.614002 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhfwk\" (UniqueName: \"kubernetes.io/projected/7a579dd7-32af-4705-913a-12ec54a71572-kube-api-access-rhfwk\") pod \"swift-operator-controller-manager-677c674df7-jfbcg\" (UID: \"7a579dd7-32af-4705-913a-12ec54a71572\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.614155 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4sh\" (UniqueName: \"kubernetes.io/projected/e6d20795-5386-4ef1-929b-21659af4ecc6-kube-api-access-hv4sh\") pod \"placement-operator-controller-manager-574d45c66c-s7fnn\" (UID: \"e6d20795-5386-4ef1-929b-21659af4ecc6\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.614824 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tm88\" (UniqueName: \"kubernetes.io/projected/a70223a5-4e89-4b37-a031-6b93350120c2-kube-api-access-2tm88\") pod \"octavia-operator-controller-manager-5f4f55cb5c-kn6vc\" (UID: \"a70223a5-4e89-4b37-a031-6b93350120c2\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.615157 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqsvh\" (UniqueName: \"kubernetes.io/projected/a9eaff72-01ad-4abf-a515-92bdda950b0f-kube-api-access-hqsvh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 
15:24:09.619462 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.620616 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.623944 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-ncv7r" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.624848 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.659335 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.661253 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.665898 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-qk575" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.666517 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.675055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkzrj\" (UniqueName: \"kubernetes.io/projected/bd198f14-3a65-4742-bf29-7938e52d3284-kube-api-access-kkzrj\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-hfrrc\" (UID: \"bd198f14-3a65-4742-bf29-7938e52d3284\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.683980 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.704104 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.749617 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.779269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkzrj\" (UniqueName: \"kubernetes.io/projected/bd198f14-3a65-4742-bf29-7938e52d3284-kube-api-access-kkzrj\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-hfrrc\" (UID: \"bd198f14-3a65-4742-bf29-7938e52d3284\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.779380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.779432 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7b4\" (UniqueName: \"kubernetes.io/projected/6d72554f-3af0-4d26-91f6-f0375b131c31-kube-api-access-7b7b4\") pod \"watcher-operator-controller-manager-6dd88c6f67-4652x\" (UID: \"6d72554f-3af0-4d26-91f6-f0375b131c31\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.779526 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d62n\" (UniqueName: \"kubernetes.io/projected/65c955aa-82e1-4acc-9c36-7029482fcac3-kube-api-access-4d62n\") pod \"test-operator-controller-manager-5c5cb9c4d7-kmmz6\" (UID: \"65c955aa-82e1-4acc-9c36-7029482fcac3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 
15:24:09.779662 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.779708 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert podName:45ac07a3-b1f8-4232-937d-6a7275aac026 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:10.779692687 +0000 UTC m=+1115.486507435 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert") pod "infra-operator-controller-manager-5995f4446f-cvxq6" (UID: "45ac07a3-b1f8-4232-937d-6a7275aac026") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.791569 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.792901 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.799694 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.800596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkzrj\" (UniqueName: \"kubernetes.io/projected/bd198f14-3a65-4742-bf29-7938e52d3284-kube-api-access-kkzrj\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-hfrrc\" (UID: \"bd198f14-3a65-4742-bf29-7938e52d3284\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.812615 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.818324 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-t564p" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.818552 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.837182 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.859541 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.875409 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.876743 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.890403 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7b4\" (UniqueName: \"kubernetes.io/projected/6d72554f-3af0-4d26-91f6-f0375b131c31-kube-api-access-7b7b4\") pod \"watcher-operator-controller-manager-6dd88c6f67-4652x\" (UID: \"6d72554f-3af0-4d26-91f6-f0375b131c31\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.890448 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.890533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d62n\" (UniqueName: \"kubernetes.io/projected/65c955aa-82e1-4acc-9c36-7029482fcac3-kube-api-access-4d62n\") pod \"test-operator-controller-manager-5c5cb9c4d7-kmmz6\" (UID: \"65c955aa-82e1-4acc-9c36-7029482fcac3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.890560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7d9f\" (UniqueName: \"kubernetes.io/projected/9f66984b-7e2a-4644-892d-96c8a0268ab6-kube-api-access-x7d9f\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:09 crc 
kubenswrapper[4743]: I0310 15:24:09.890579 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.890878 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.915807 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-4pm7q" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.920531 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.926585 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7b4\" (UniqueName: \"kubernetes.io/projected/6d72554f-3af0-4d26-91f6-f0375b131c31-kube-api-access-7b7b4\") pod \"watcher-operator-controller-manager-6dd88c6f67-4652x\" (UID: \"6d72554f-3af0-4d26-91f6-f0375b131c31\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.947576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d62n\" (UniqueName: \"kubernetes.io/projected/65c955aa-82e1-4acc-9c36-7029482fcac3-kube-api-access-4d62n\") pod \"test-operator-controller-manager-5c5cb9c4d7-kmmz6\" (UID: \"65c955aa-82e1-4acc-9c36-7029482fcac3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" Mar 10 15:24:09 crc 
kubenswrapper[4743]: I0310 15:24:09.949427 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.976179 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.981871 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h"] Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.992129 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kflcz\" (UniqueName: \"kubernetes.io/projected/94a79fc9-d43a-4a8d-9d33-1bcfd7306537-kube-api-access-kflcz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pmnxq\" (UID: \"94a79fc9-d43a-4a8d-9d33-1bcfd7306537\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.992409 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7d9f\" (UniqueName: \"kubernetes.io/projected/9f66984b-7e2a-4644-892d-96c8a0268ab6-kube-api-access-x7d9f\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:09 crc kubenswrapper[4743]: I0310 15:24:09.992473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:09 crc 
kubenswrapper[4743]: I0310 15:24:09.992619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.992837 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.992930 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:10.49290267 +0000 UTC m=+1115.199717418 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "webhook-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.996404 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:24:09 crc kubenswrapper[4743]: E0310 15:24:09.996458 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:10.496441542 +0000 UTC m=+1115.203256290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "metrics-server-cert" not found Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.014627 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.019434 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7d9f\" (UniqueName: \"kubernetes.io/projected/9f66984b-7e2a-4644-892d-96c8a0268ab6-kube-api-access-x7d9f\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.093720 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kflcz\" (UniqueName: \"kubernetes.io/projected/94a79fc9-d43a-4a8d-9d33-1bcfd7306537-kube-api-access-kflcz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pmnxq\" (UID: \"94a79fc9-d43a-4a8d-9d33-1bcfd7306537\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.094440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:10 crc kubenswrapper[4743]: E0310 15:24:10.098439 4743 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:10 crc kubenswrapper[4743]: E0310 15:24:10.098593 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert podName:a9eaff72-01ad-4abf-a515-92bdda950b0f nodeName:}" failed. No retries permitted until 2026-03-10 15:24:11.098511238 +0000 UTC m=+1115.805326186 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" (UID: "a9eaff72-01ad-4abf-a515-92bdda950b0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.132651 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kflcz\" (UniqueName: \"kubernetes.io/projected/94a79fc9-d43a-4a8d-9d33-1bcfd7306537-kube-api-access-kflcz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pmnxq\" (UID: \"94a79fc9-d43a-4a8d-9d33-1bcfd7306537\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.157187 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m"] Mar 10 15:24:10 crc kubenswrapper[4743]: W0310 15:24:10.237008 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf42ffd4_e446_4dee_b343_b8d64dcb8e2c.slice/crio-51b3f6736ff55cfea5c2f77afebe98c61c19030943b62eb3f61fbdc3a5970b65 WatchSource:0}: Error finding container 51b3f6736ff55cfea5c2f77afebe98c61c19030943b62eb3f61fbdc3a5970b65: Status 404 returned error can't find the container with id 
51b3f6736ff55cfea5c2f77afebe98c61c19030943b62eb3f61fbdc3a5970b65 Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.283855 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.508120 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.508605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:10 crc kubenswrapper[4743]: E0310 15:24:10.508309 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:24:10 crc kubenswrapper[4743]: E0310 15:24:10.508910 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:11.508895602 +0000 UTC m=+1116.215710350 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "metrics-server-cert" not found Mar 10 15:24:10 crc kubenswrapper[4743]: E0310 15:24:10.508855 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:24:10 crc kubenswrapper[4743]: E0310 15:24:10.509269 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:11.509243862 +0000 UTC m=+1116.216058610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "webhook-server-cert" not found Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.820154 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77"] Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.824877 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:10 crc kubenswrapper[4743]: E0310 15:24:10.825010 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:10 crc 
kubenswrapper[4743]: E0310 15:24:10.825059 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert podName:45ac07a3-b1f8-4232-937d-6a7275aac026 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:12.825045866 +0000 UTC m=+1117.531860614 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert") pod "infra-operator-controller-manager-5995f4446f-cvxq6" (UID: "45ac07a3-b1f8-4232-937d-6a7275aac026") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.868618 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz"] Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.883408 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz"] Mar 10 15:24:10 crc kubenswrapper[4743]: W0310 15:24:10.888061 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39bb6a2_9c5a_4acc_bbae_fe5d63a971c8.slice/crio-1933cd491d4c5b56061e0db2ec05e6483d518e69788952ae28abf6870bb780cd WatchSource:0}: Error finding container 1933cd491d4c5b56061e0db2ec05e6483d518e69788952ae28abf6870bb780cd: Status 404 returned error can't find the container with id 1933cd491d4c5b56061e0db2ec05e6483d518e69788952ae28abf6870bb780cd Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.902097 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz"] Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.934763 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" 
event={"ID":"b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8","Type":"ContainerStarted","Data":"1933cd491d4c5b56061e0db2ec05e6483d518e69788952ae28abf6870bb780cd"} Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.941123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" event={"ID":"8485c807-58f7-45c6-845f-1bc881558553","Type":"ContainerStarted","Data":"b4ae4d6c9045844a6f9889cc00c2a7be364c9b47e5bd6e6e5590eb14dc52b6c0"} Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.942711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" event={"ID":"bf42ffd4-e446-4dee-b343-b8d64dcb8e2c","Type":"ContainerStarted","Data":"51b3f6736ff55cfea5c2f77afebe98c61c19030943b62eb3f61fbdc3a5970b65"} Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.943720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" event={"ID":"364fba36-380b-47fb-9f0d-38585fb94bac","Type":"ContainerStarted","Data":"efb314ecdea9c6f387b31493ee74663d12dcd76e32b0a741d2473f29f62e8dd3"} Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.944697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" event={"ID":"4b4416c1-939f-4740-bd01-f45d9cfd8822","Type":"ContainerStarted","Data":"f639d03b52cf08e8965a0a1630ed5428e33dffc114a2ab26f593e1557cba372e"} Mar 10 15:24:10 crc kubenswrapper[4743]: I0310 15:24:10.998282 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8"] Mar 10 15:24:11 crc kubenswrapper[4743]: W0310 15:24:11.019541 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode17c0b05_44be_478d_8674_fe42a01f9397.slice/crio-4a93028942bac31eb972b551316c2e01caf31aea029c4b33387f8f80a87f856d WatchSource:0}: Error finding container 4a93028942bac31eb972b551316c2e01caf31aea029c4b33387f8f80a87f856d: Status 404 returned error can't find the container with id 4a93028942bac31eb972b551316c2e01caf31aea029c4b33387f8f80a87f856d Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.030723 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5"] Mar 10 15:24:11 crc kubenswrapper[4743]: W0310 15:24:11.038787 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38bf9e64_a759_40d7_a3f9_f443299160ec.slice/crio-38d9146a782400a9b5167e02eafb6bf3f96e8a07e0af4751f3a811979f002ce6 WatchSource:0}: Error finding container 38d9146a782400a9b5167e02eafb6bf3f96e8a07e0af4751f3a811979f002ce6: Status 404 returned error can't find the container with id 38d9146a782400a9b5167e02eafb6bf3f96e8a07e0af4751f3a811979f002ce6 Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.070464 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.131299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.131649 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.131723 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert podName:a9eaff72-01ad-4abf-a515-92bdda950b0f nodeName:}" failed. No retries permitted until 2026-03-10 15:24:13.131702566 +0000 UTC m=+1117.838517314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" (UID: "a9eaff72-01ad-4abf-a515-92bdda950b0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.379779 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.396017 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.413866 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.420223 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.427235 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.436991 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.446592 4743 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.454600 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.461321 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.466762 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq"] Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.540606 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.540735 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.540841 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.540938 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:24:11 crc 
kubenswrapper[4743]: E0310 15:24:11.540957 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:13.540931737 +0000 UTC m=+1118.247746535 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "metrics-server-cert" not found Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.540996 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:13.540979028 +0000 UTC m=+1118.247793856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "webhook-server-cert" not found Mar 10 15:24:11 crc kubenswrapper[4743]: W0310 15:24:11.576922 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda23de576_a42f_4a68_8e30_5a71791b89e0.slice/crio-4bcf7d7e00aea6ecdc083ffc4d1cfa634e3acc2cc99f61ffe4a716f64074bac3 WatchSource:0}: Error finding container 4bcf7d7e00aea6ecdc083ffc4d1cfa634e3acc2cc99f61ffe4a716f64074bac3: Status 404 returned error can't find the container with id 4bcf7d7e00aea6ecdc083ffc4d1cfa634e3acc2cc99f61ffe4a716f64074bac3 Mar 10 15:24:11 crc kubenswrapper[4743]: W0310 15:24:11.579246 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d20795_5386_4ef1_929b_21659af4ecc6.slice/crio-ccba8284be6b1cd7ef92477e3f549e92a2ce1ae1beacae81b83128ca53bab0e2 WatchSource:0}: Error finding container ccba8284be6b1cd7ef92477e3f549e92a2ce1ae1beacae81b83128ca53bab0e2: Status 404 returned error can't find the container with id ccba8284be6b1cd7ef92477e3f549e92a2ce1ae1beacae81b83128ca53bab0e2 Mar 10 15:24:11 crc kubenswrapper[4743]: W0310 15:24:11.581570 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13525a0b_792c_4304_9b21_4ff84f391e20.slice/crio-73a3684949b8e6b0e9688c829d3315827c157d91e2db4eade875323f930f0f49 WatchSource:0}: Error finding container 73a3684949b8e6b0e9688c829d3315827c157d91e2db4eade875323f930f0f49: Status 404 returned error can't find the container with id 73a3684949b8e6b0e9688c829d3315827c157d91e2db4eade875323f930f0f49 Mar 10 15:24:11 crc kubenswrapper[4743]: W0310 15:24:11.623264 4743 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda70223a5_4e89_4b37_a031_6b93350120c2.slice/crio-164bc2584408d7749b07dcafff2a2f4f38fb1b02ad8fca075bda616bfcdb37fb WatchSource:0}: Error finding container 164bc2584408d7749b07dcafff2a2f4f38fb1b02ad8fca075bda616bfcdb37fb: Status 404 returned error can't find the container with id 164bc2584408d7749b07dcafff2a2f4f38fb1b02ad8fca075bda616bfcdb37fb Mar 10 15:24:11 crc kubenswrapper[4743]: W0310 15:24:11.629095 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa1fe66b_2fd5_40fe_8062_9fd96edd519a.slice/crio-e79353f618bc308ac09025b8058aba88d3248a33d9ae97b3fb0a140191ae2ee7 WatchSource:0}: Error finding container e79353f618bc308ac09025b8058aba88d3248a33d9ae97b3fb0a140191ae2ee7: Status 404 returned error can't find the container with id e79353f618bc308ac09025b8058aba88d3248a33d9ae97b3fb0a140191ae2ee7 Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.634776 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5w5vx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-wbbgz_openstack-operators(fa1fe66b-2fd5-40fe-8062-9fd96edd519a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.636943 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" podUID="fa1fe66b-2fd5-40fe-8062-9fd96edd519a" Mar 10 15:24:11 crc 
kubenswrapper[4743]: E0310 15:24:11.640046 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2tm88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-kn6vc_openstack-operators(a70223a5-4e89-4b37-a031-6b93350120c2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.641290 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" podUID="a70223a5-4e89-4b37-a031-6b93350120c2" Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.646181 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7b7b4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-4652x_openstack-operators(6d72554f-3af0-4d26-91f6-f0375b131c31): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.647988 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" podUID="6d72554f-3af0-4d26-91f6-f0375b131c31" Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.648922 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kflcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pmnxq_openstack-operators(94a79fc9-d43a-4a8d-9d33-1bcfd7306537): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.650022 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" podUID="94a79fc9-d43a-4a8d-9d33-1bcfd7306537" Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.954903 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" event={"ID":"7b3fecc3-bc35-40a2-b927-65cda4fbe04d","Type":"ContainerStarted","Data":"725d6bb8d991cdd051f2c08368ab600672c5bb9787fca15fb1713d07ab2185ac"} Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.957317 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" event={"ID":"38bf9e64-a759-40d7-a3f9-f443299160ec","Type":"ContainerStarted","Data":"38d9146a782400a9b5167e02eafb6bf3f96e8a07e0af4751f3a811979f002ce6"} Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.960046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" event={"ID":"13525a0b-792c-4304-9b21-4ff84f391e20","Type":"ContainerStarted","Data":"73a3684949b8e6b0e9688c829d3315827c157d91e2db4eade875323f930f0f49"} Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.961017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" event={"ID":"e17c0b05-44be-478d-8674-fe42a01f9397","Type":"ContainerStarted","Data":"4a93028942bac31eb972b551316c2e01caf31aea029c4b33387f8f80a87f856d"} Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.962525 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" 
event={"ID":"33b59c71-8ce1-468d-b617-50bc021d41b2","Type":"ContainerStarted","Data":"6eb3351237c6fdb4008fec382b2ec88c2cdbfa2d5202c7909da488e3fec7ef63"} Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.965461 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" event={"ID":"5525e521-3469-4b99-8fc0-0894d01bfbb1","Type":"ContainerStarted","Data":"5c93d58149abdfed1a734a1a6e7ceffdba008e912bbb7bfb1a8ee41aed2974b5"} Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.969670 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" event={"ID":"94a79fc9-d43a-4a8d-9d33-1bcfd7306537","Type":"ContainerStarted","Data":"e6442a1146246c924dfd5a332441d27e473468c1f069e4b54d50486782b4c0f7"} Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.971053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" event={"ID":"e6d20795-5386-4ef1-929b-21659af4ecc6","Type":"ContainerStarted","Data":"ccba8284be6b1cd7ef92477e3f549e92a2ce1ae1beacae81b83128ca53bab0e2"} Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.972606 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" podUID="94a79fc9-d43a-4a8d-9d33-1bcfd7306537" Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.974215 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" 
event={"ID":"a23de576-a42f-4a68-8e30-5a71791b89e0","Type":"ContainerStarted","Data":"4bcf7d7e00aea6ecdc083ffc4d1cfa634e3acc2cc99f61ffe4a716f64074bac3"} Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.990891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" event={"ID":"6d72554f-3af0-4d26-91f6-f0375b131c31","Type":"ContainerStarted","Data":"daefe774bf93002c13766502d0a1a41e7d8e59f0c6276167401db170ca4a0594"} Mar 10 15:24:11 crc kubenswrapper[4743]: E0310 15:24:11.995942 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" podUID="6d72554f-3af0-4d26-91f6-f0375b131c31" Mar 10 15:24:11 crc kubenswrapper[4743]: I0310 15:24:11.997442 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" event={"ID":"7a579dd7-32af-4705-913a-12ec54a71572","Type":"ContainerStarted","Data":"9b9e98d73c09ddc93f3b410a79a3c08fdf741f746f3fc89029d1c957b035abce"} Mar 10 15:24:12 crc kubenswrapper[4743]: I0310 15:24:12.007473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" event={"ID":"a70223a5-4e89-4b37-a031-6b93350120c2","Type":"ContainerStarted","Data":"164bc2584408d7749b07dcafff2a2f4f38fb1b02ad8fca075bda616bfcdb37fb"} Mar 10 15:24:12 crc kubenswrapper[4743]: E0310 15:24:12.018696 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" podUID="a70223a5-4e89-4b37-a031-6b93350120c2" Mar 10 15:24:12 crc kubenswrapper[4743]: I0310 15:24:12.029077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" event={"ID":"bd198f14-3a65-4742-bf29-7938e52d3284","Type":"ContainerStarted","Data":"d60e5d426a08e0a6ea5ee55e41c43dedc44b5a7f07aa7ec71f9b78ea0e3ca640"} Mar 10 15:24:12 crc kubenswrapper[4743]: I0310 15:24:12.034124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" event={"ID":"65c955aa-82e1-4acc-9c36-7029482fcac3","Type":"ContainerStarted","Data":"dc5cc7be6a1b8547fdc58e5daba951e9899c8c5d04f3de7ccfdc6a37f2d528f8"} Mar 10 15:24:12 crc kubenswrapper[4743]: I0310 15:24:12.043479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" event={"ID":"fa1fe66b-2fd5-40fe-8062-9fd96edd519a","Type":"ContainerStarted","Data":"e79353f618bc308ac09025b8058aba88d3248a33d9ae97b3fb0a140191ae2ee7"} Mar 10 15:24:12 crc kubenswrapper[4743]: E0310 15:24:12.045607 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" podUID="fa1fe66b-2fd5-40fe-8062-9fd96edd519a" Mar 10 15:24:12 crc kubenswrapper[4743]: I0310 15:24:12.886156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert\") 
pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:12 crc kubenswrapper[4743]: E0310 15:24:12.886401 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:12 crc kubenswrapper[4743]: E0310 15:24:12.886482 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert podName:45ac07a3-b1f8-4232-937d-6a7275aac026 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:16.88646327 +0000 UTC m=+1121.593278018 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert") pod "infra-operator-controller-manager-5995f4446f-cvxq6" (UID: "45ac07a3-b1f8-4232-937d-6a7275aac026") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.057389 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" podUID="6d72554f-3af0-4d26-91f6-f0375b131c31" Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.057849 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" podUID="fa1fe66b-2fd5-40fe-8062-9fd96edd519a" Mar 10 
15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.058336 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" podUID="94a79fc9-d43a-4a8d-9d33-1bcfd7306537" Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.059559 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" podUID="a70223a5-4e89-4b37-a031-6b93350120c2" Mar 10 15:24:13 crc kubenswrapper[4743]: I0310 15:24:13.193539 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.194056 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.194141 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert podName:a9eaff72-01ad-4abf-a515-92bdda950b0f nodeName:}" failed. 
No retries permitted until 2026-03-10 15:24:17.194100919 +0000 UTC m=+1121.900915657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" (UID: "a9eaff72-01ad-4abf-a515-92bdda950b0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:13 crc kubenswrapper[4743]: I0310 15:24:13.601995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:13 crc kubenswrapper[4743]: I0310 15:24:13.602118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.602270 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.602329 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:17.602311101 +0000 UTC m=+1122.309125849 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "metrics-server-cert" not found Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.602381 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:24:13 crc kubenswrapper[4743]: E0310 15:24:13.602538 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:17.602521627 +0000 UTC m=+1122.309336375 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "webhook-server-cert" not found Mar 10 15:24:16 crc kubenswrapper[4743]: I0310 15:24:16.956391 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:16 crc kubenswrapper[4743]: E0310 15:24:16.956682 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:16 crc kubenswrapper[4743]: E0310 15:24:16.957137 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert 
podName:45ac07a3-b1f8-4232-937d-6a7275aac026 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:24.957108338 +0000 UTC m=+1129.663923086 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert") pod "infra-operator-controller-manager-5995f4446f-cvxq6" (UID: "45ac07a3-b1f8-4232-937d-6a7275aac026") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:24:17 crc kubenswrapper[4743]: I0310 15:24:17.263007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:17 crc kubenswrapper[4743]: E0310 15:24:17.263222 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:17 crc kubenswrapper[4743]: E0310 15:24:17.263320 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert podName:a9eaff72-01ad-4abf-a515-92bdda950b0f nodeName:}" failed. No retries permitted until 2026-03-10 15:24:25.263294565 +0000 UTC m=+1129.970109483 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" (UID: "a9eaff72-01ad-4abf-a515-92bdda950b0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:24:17 crc kubenswrapper[4743]: I0310 15:24:17.669939 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:17 crc kubenswrapper[4743]: I0310 15:24:17.670084 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:17 crc kubenswrapper[4743]: E0310 15:24:17.670186 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:24:17 crc kubenswrapper[4743]: E0310 15:24:17.670252 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:24:17 crc kubenswrapper[4743]: E0310 15:24:17.670334 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:25.670283422 +0000 UTC m=+1130.377098170 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "webhook-server-cert" not found Mar 10 15:24:17 crc kubenswrapper[4743]: E0310 15:24:17.670367 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:25.670355104 +0000 UTC m=+1130.377170082 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "metrics-server-cert" not found Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.136253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" event={"ID":"bd198f14-3a65-4742-bf29-7938e52d3284","Type":"ContainerStarted","Data":"6f66bde68d00f353a0ae73893418234552cc7fe0cfa1b0e193127daac996410e"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.136884 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.140366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" event={"ID":"bf42ffd4-e446-4dee-b343-b8d64dcb8e2c","Type":"ContainerStarted","Data":"f0b5358f587bfbc75ba7666a1dbb7fb3c3e07a7d1ad2aabf94af2d035d543ea8"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.141103 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.143876 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" event={"ID":"e17c0b05-44be-478d-8674-fe42a01f9397","Type":"ContainerStarted","Data":"71ff7923e10306cfb24ebcebfb77bb275f6c8e279768c5d5cfba433b550d8471"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.144638 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.146758 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" event={"ID":"33b59c71-8ce1-468d-b617-50bc021d41b2","Type":"ContainerStarted","Data":"ecdd6725817e6388873bdd94f6e03c0133db529ee3fa82247bca2f112c0d7ba5"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.147043 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.151965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" event={"ID":"8485c807-58f7-45c6-845f-1bc881558553","Type":"ContainerStarted","Data":"0e10b3b8e457508a6c8bcadfeb23fd5227903280ad25511b479d535f170a620e"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.152580 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.154142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" 
event={"ID":"5525e521-3469-4b99-8fc0-0894d01bfbb1","Type":"ContainerStarted","Data":"983ba6c48b40ae746566aa881e094aed4390a6de08ab7f87ed7bb8a33679dd4d"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.154605 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.159257 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" podStartSLOduration=3.277112262 podStartE2EDuration="14.159244276s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.618295542 +0000 UTC m=+1116.325110290" lastFinishedPulling="2026-03-10 15:24:22.500427526 +0000 UTC m=+1127.207242304" observedRunningTime="2026-03-10 15:24:23.15521319 +0000 UTC m=+1127.862027938" watchObservedRunningTime="2026-03-10 15:24:23.159244276 +0000 UTC m=+1127.866059024" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.182526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" event={"ID":"38bf9e64-a759-40d7-a3f9-f443299160ec","Type":"ContainerStarted","Data":"d6a33651d3bd45f62a0bd55e9439f45ce4708138ecc75bc445d56ee30cda6155"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.183604 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.187643 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" event={"ID":"364fba36-380b-47fb-9f0d-38585fb94bac","Type":"ContainerStarted","Data":"5de06842cc3f63a3e8ca728e22dfbe5b7b22328edea4f4f11303c3b6f5d0d8e6"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.188474 
4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.191346 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" podStartSLOduration=5.455263981 podStartE2EDuration="15.191332379s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:10.043034522 +0000 UTC m=+1114.749849270" lastFinishedPulling="2026-03-10 15:24:19.77910292 +0000 UTC m=+1124.485917668" observedRunningTime="2026-03-10 15:24:23.189157436 +0000 UTC m=+1127.895972184" watchObservedRunningTime="2026-03-10 15:24:23.191332379 +0000 UTC m=+1127.898147137" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.194521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" event={"ID":"13525a0b-792c-4304-9b21-4ff84f391e20","Type":"ContainerStarted","Data":"b7b6bf4716691b20b599f8f45a09bda0d8eb28ddaff4b37c42f76cd2a05e4a23"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.194734 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.205292 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" event={"ID":"4b4416c1-939f-4740-bd01-f45d9cfd8822","Type":"ContainerStarted","Data":"72db54ec00c59e41e79d1d117491c75bc3ad40dd10f7907dc783403918f24dfd"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.206167 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.211302 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" podStartSLOduration=3.63421788 podStartE2EDuration="15.211289653s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:10.914775476 +0000 UTC m=+1115.621590214" lastFinishedPulling="2026-03-10 15:24:22.491847229 +0000 UTC m=+1127.198661987" observedRunningTime="2026-03-10 15:24:23.206739922 +0000 UTC m=+1127.913554670" watchObservedRunningTime="2026-03-10 15:24:23.211289653 +0000 UTC m=+1127.918104401" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.222117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" event={"ID":"7b3fecc3-bc35-40a2-b927-65cda4fbe04d","Type":"ContainerStarted","Data":"ab59036eebc8b5fc658e639da5dc30e36a1d6dc1a8ba1fa694bd5a1923e4a488"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.222992 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.233555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" event={"ID":"e6d20795-5386-4ef1-929b-21659af4ecc6","Type":"ContainerStarted","Data":"4ff428387d60e2de58f0db54f4cdea8351d8eb7f8f19848f4ddea72a69204b36"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.234406 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.235730 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" 
event={"ID":"7a579dd7-32af-4705-913a-12ec54a71572","Type":"ContainerStarted","Data":"1504ca9b6872965872a9aa9e0a870bc9e61287dfb50c6868b761bc83649579e4"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.236176 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.237311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" event={"ID":"b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8","Type":"ContainerStarted","Data":"d6ed074461d1c706359094f042feb0a4c50fc2c37e4bb93e93ee7a9322000ce6"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.237521 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.244796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" event={"ID":"65c955aa-82e1-4acc-9c36-7029482fcac3","Type":"ContainerStarted","Data":"50eb2d02f47d1db80e9d2e409957ce13dd0d79a75a634bae1d1495e71f043a0e"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.245528 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.246858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" event={"ID":"a23de576-a42f-4a68-8e30-5a71791b89e0","Type":"ContainerStarted","Data":"eda1d27bde4ba95b7370c2fab4f503a714b18018ecbdc22ae6df338219f14c8e"} Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.247445 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.262617 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" podStartSLOduration=5.726796231 podStartE2EDuration="15.262601749s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:10.243341693 +0000 UTC m=+1114.950156441" lastFinishedPulling="2026-03-10 15:24:19.779147171 +0000 UTC m=+1124.485961959" observedRunningTime="2026-03-10 15:24:23.256229196 +0000 UTC m=+1127.963043954" watchObservedRunningTime="2026-03-10 15:24:23.262601749 +0000 UTC m=+1127.969416497" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.290162 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" podStartSLOduration=6.547471467 podStartE2EDuration="15.290141111s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.03553946 +0000 UTC m=+1115.742354208" lastFinishedPulling="2026-03-10 15:24:19.778209104 +0000 UTC m=+1124.485023852" observedRunningTime="2026-03-10 15:24:23.288930626 +0000 UTC m=+1127.995745374" watchObservedRunningTime="2026-03-10 15:24:23.290141111 +0000 UTC m=+1127.996955859" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.364918 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" podStartSLOduration=3.779737397 podStartE2EDuration="15.364895492s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:10.915323722 +0000 UTC m=+1115.622138460" lastFinishedPulling="2026-03-10 15:24:22.500481797 +0000 UTC m=+1127.207296555" observedRunningTime="2026-03-10 15:24:23.327928478 +0000 UTC m=+1128.034743216" 
watchObservedRunningTime="2026-03-10 15:24:23.364895492 +0000 UTC m=+1128.071710240" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.365616 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" podStartSLOduration=3.7847361299999998 podStartE2EDuration="15.365608392s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:10.906980112 +0000 UTC m=+1115.613794860" lastFinishedPulling="2026-03-10 15:24:22.487852374 +0000 UTC m=+1127.194667122" observedRunningTime="2026-03-10 15:24:23.357155139 +0000 UTC m=+1128.063969887" watchObservedRunningTime="2026-03-10 15:24:23.365608392 +0000 UTC m=+1128.072423150" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.404768 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" podStartSLOduration=3.512120182 podStartE2EDuration="14.404752208s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.599563643 +0000 UTC m=+1116.306378391" lastFinishedPulling="2026-03-10 15:24:22.492195629 +0000 UTC m=+1127.199010417" observedRunningTime="2026-03-10 15:24:23.4003206 +0000 UTC m=+1128.107135348" watchObservedRunningTime="2026-03-10 15:24:23.404752208 +0000 UTC m=+1128.111566956" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.440361 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" podStartSLOduration=3.458976301 podStartE2EDuration="15.440344552s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:09.962628739 +0000 UTC m=+1114.669443487" lastFinishedPulling="2026-03-10 15:24:21.94399699 +0000 UTC m=+1126.650811738" observedRunningTime="2026-03-10 15:24:23.425045932 +0000 UTC m=+1128.131860680" 
watchObservedRunningTime="2026-03-10 15:24:23.440344552 +0000 UTC m=+1128.147159300" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.518923 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" podStartSLOduration=3.613070746 podStartE2EDuration="14.518900801s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.583062219 +0000 UTC m=+1116.289876967" lastFinishedPulling="2026-03-10 15:24:22.488892264 +0000 UTC m=+1127.195707022" observedRunningTime="2026-03-10 15:24:23.5118917 +0000 UTC m=+1128.218706438" watchObservedRunningTime="2026-03-10 15:24:23.518900801 +0000 UTC m=+1128.225715549" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.758013 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" podStartSLOduration=4.175310624 podStartE2EDuration="15.757997419s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:10.906343303 +0000 UTC m=+1115.613158051" lastFinishedPulling="2026-03-10 15:24:22.489030058 +0000 UTC m=+1127.195844846" observedRunningTime="2026-03-10 15:24:23.665982862 +0000 UTC m=+1128.372797610" watchObservedRunningTime="2026-03-10 15:24:23.757997419 +0000 UTC m=+1128.464812167" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.862215 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" podStartSLOduration=4.96051828 podStartE2EDuration="15.862200976s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.587192657 +0000 UTC m=+1116.294007405" lastFinishedPulling="2026-03-10 15:24:22.488875353 +0000 UTC m=+1127.195690101" observedRunningTime="2026-03-10 15:24:23.859634782 +0000 UTC m=+1128.566449530" 
watchObservedRunningTime="2026-03-10 15:24:23.862200976 +0000 UTC m=+1128.569015724" Mar 10 15:24:23 crc kubenswrapper[4743]: I0310 15:24:23.863547 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" podStartSLOduration=4.864288393 podStartE2EDuration="15.863541675s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.587168727 +0000 UTC m=+1116.293983475" lastFinishedPulling="2026-03-10 15:24:22.586422009 +0000 UTC m=+1127.293236757" observedRunningTime="2026-03-10 15:24:23.760285295 +0000 UTC m=+1128.467100053" watchObservedRunningTime="2026-03-10 15:24:23.863541675 +0000 UTC m=+1128.570356423" Mar 10 15:24:24 crc kubenswrapper[4743]: I0310 15:24:24.012710 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" podStartSLOduration=4.555088868 podStartE2EDuration="16.012691465s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.0425168 +0000 UTC m=+1115.749331538" lastFinishedPulling="2026-03-10 15:24:22.500119387 +0000 UTC m=+1127.206934135" observedRunningTime="2026-03-10 15:24:23.935174395 +0000 UTC m=+1128.641989143" watchObservedRunningTime="2026-03-10 15:24:24.012691465 +0000 UTC m=+1128.719506213" Mar 10 15:24:24 crc kubenswrapper[4743]: I0310 15:24:24.087828 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" podStartSLOduration=5.142541046 podStartE2EDuration="16.087794355s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.589247907 +0000 UTC m=+1116.296062645" lastFinishedPulling="2026-03-10 15:24:22.534501206 +0000 UTC m=+1127.241315954" observedRunningTime="2026-03-10 15:24:24.018578814 +0000 UTC m=+1128.725393562" 
watchObservedRunningTime="2026-03-10 15:24:24.087794355 +0000 UTC m=+1128.794609473" Mar 10 15:24:24 crc kubenswrapper[4743]: I0310 15:24:24.984956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:24 crc kubenswrapper[4743]: I0310 15:24:24.999766 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45ac07a3-b1f8-4232-937d-6a7275aac026-cert\") pod \"infra-operator-controller-manager-5995f4446f-cvxq6\" (UID: \"45ac07a3-b1f8-4232-937d-6a7275aac026\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.278155 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.289943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.296398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9eaff72-01ad-4abf-a515-92bdda950b0f-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w\" (UID: \"a9eaff72-01ad-4abf-a515-92bdda950b0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.405333 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.695941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.696048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:25 crc kubenswrapper[4743]: E0310 15:24:25.696277 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:24:25 crc kubenswrapper[4743]: E0310 15:24:25.696413 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs podName:9f66984b-7e2a-4644-892d-96c8a0268ab6 nodeName:}" failed. No retries permitted until 2026-03-10 15:24:41.696394264 +0000 UTC m=+1146.403209012 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-8bnxf" (UID: "9f66984b-7e2a-4644-892d-96c8a0268ab6") : secret "webhook-server-cert" not found Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.710780 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.792714 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" podStartSLOduration=5.91439378 podStartE2EDuration="16.792686594s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.610902689 +0000 UTC m=+1116.317717437" lastFinishedPulling="2026-03-10 15:24:22.489195493 +0000 UTC m=+1127.196010251" observedRunningTime="2026-03-10 15:24:24.098240556 +0000 UTC m=+1128.805055304" watchObservedRunningTime="2026-03-10 15:24:25.792686594 +0000 UTC m=+1130.499501332" Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.794167 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6"] Mar 10 15:24:25 crc kubenswrapper[4743]: I0310 15:24:25.879267 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w"] Mar 10 15:24:25 crc kubenswrapper[4743]: W0310 15:24:25.894111 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9eaff72_01ad_4abf_a515_92bdda950b0f.slice/crio-ecbae4221d0a7003bd6546c14f48f560ad4dc82ee0128f05c6501fc8bb30622c WatchSource:0}: Error finding container ecbae4221d0a7003bd6546c14f48f560ad4dc82ee0128f05c6501fc8bb30622c: Status 404 returned error can't find the container with id ecbae4221d0a7003bd6546c14f48f560ad4dc82ee0128f05c6501fc8bb30622c Mar 10 15:24:26 crc kubenswrapper[4743]: I0310 15:24:26.279123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" event={"ID":"a9eaff72-01ad-4abf-a515-92bdda950b0f","Type":"ContainerStarted","Data":"ecbae4221d0a7003bd6546c14f48f560ad4dc82ee0128f05c6501fc8bb30622c"} Mar 10 15:24:26 crc kubenswrapper[4743]: I0310 15:24:26.281837 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" event={"ID":"45ac07a3-b1f8-4232-937d-6a7275aac026","Type":"ContainerStarted","Data":"f55b47e84d7c68fc0d5b1bfbdc4901654f80282ed5b0894b0908bfae2b2895ab"} Mar 10 15:24:28 crc kubenswrapper[4743]: I0310 15:24:28.304951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" event={"ID":"6d72554f-3af0-4d26-91f6-f0375b131c31","Type":"ContainerStarted","Data":"403eda4c471cc40d0f7618ea9ec357ef1f2a970f90f2da6f3814060352ab172e"} Mar 10 15:24:28 crc kubenswrapper[4743]: I0310 15:24:28.306585 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" Mar 10 15:24:28 crc kubenswrapper[4743]: I0310 15:24:28.309132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" 
event={"ID":"a70223a5-4e89-4b37-a031-6b93350120c2","Type":"ContainerStarted","Data":"f8207e9dade25988d7af044d65c65ceeed3344a1aea088252af5f29f08a181a6"} Mar 10 15:24:28 crc kubenswrapper[4743]: I0310 15:24:28.309942 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" Mar 10 15:24:28 crc kubenswrapper[4743]: I0310 15:24:28.345439 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" podStartSLOduration=3.5572472299999998 podStartE2EDuration="19.345414501s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.64604404 +0000 UTC m=+1116.352858788" lastFinishedPulling="2026-03-10 15:24:27.434211321 +0000 UTC m=+1132.141026059" observedRunningTime="2026-03-10 15:24:28.333671624 +0000 UTC m=+1133.040486372" watchObservedRunningTime="2026-03-10 15:24:28.345414501 +0000 UTC m=+1133.052229259" Mar 10 15:24:28 crc kubenswrapper[4743]: I0310 15:24:28.354385 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" podStartSLOduration=4.540071776 podStartE2EDuration="20.354366119s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.639880083 +0000 UTC m=+1116.346694831" lastFinishedPulling="2026-03-10 15:24:27.454174426 +0000 UTC m=+1132.160989174" observedRunningTime="2026-03-10 15:24:28.350045955 +0000 UTC m=+1133.056860703" watchObservedRunningTime="2026-03-10 15:24:28.354366119 +0000 UTC m=+1133.061180867" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.178939 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bdbfx" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.280649 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-lcr6h" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.330336 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsd2m" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.356245 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-8ppkz" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.366524 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-4vj77" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.376653 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-97gbz" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.386113 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-gsvwz" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.457136 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-r4gcw" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.512377 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-vcztk" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.582115 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-s8gf8" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.617079 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-v5fw5" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.671395 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-vqpbs" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.816367 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-jfbcg" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.863801 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-s7fnn" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.930794 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-hfrrc" Mar 10 15:24:29 crc kubenswrapper[4743]: I0310 15:24:29.980747 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kmmz6" Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.376068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" event={"ID":"94a79fc9-d43a-4a8d-9d33-1bcfd7306537","Type":"ContainerStarted","Data":"30622aa5ce87fbdef51244dd7a85c188d3772342bfb408b60e6957f3dcb991b4"} Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.380057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" event={"ID":"a9eaff72-01ad-4abf-a515-92bdda950b0f","Type":"ContainerStarted","Data":"ee3cb205bf7bea3bf89dafb7e416e100e778e5f0be86c5ebe6dcfb29c052d66f"} Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.380228 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.381899 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" event={"ID":"45ac07a3-b1f8-4232-937d-6a7275aac026","Type":"ContainerStarted","Data":"748144ce95c8ec918460de3741a77b85821a8a14ec954ddb679a2a511d967077"} Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.382195 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.383971 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" event={"ID":"fa1fe66b-2fd5-40fe-8062-9fd96edd519a","Type":"ContainerStarted","Data":"fe60944720f82781f94dee9b2706e740ef2cd2a0d5ada1e645693dccc7e671f2"} Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.384169 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.401156 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmnxq" podStartSLOduration=3.566270071 podStartE2EDuration="25.401135099s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.648839851 +0000 UTC m=+1116.355654599" lastFinishedPulling="2026-03-10 15:24:33.483704879 +0000 UTC m=+1138.190519627" observedRunningTime="2026-03-10 15:24:34.394538379 +0000 UTC m=+1139.101353137" watchObservedRunningTime="2026-03-10 15:24:34.401135099 +0000 UTC m=+1139.107949857" Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.417900 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" podStartSLOduration=18.761848422 podStartE2EDuration="26.41788071s" podCreationTimestamp="2026-03-10 15:24:08 +0000 UTC" firstStartedPulling="2026-03-10 15:24:25.812939217 +0000 UTC m=+1130.519753965" lastFinishedPulling="2026-03-10 15:24:33.468971515 +0000 UTC m=+1138.175786253" observedRunningTime="2026-03-10 15:24:34.411142736 +0000 UTC m=+1139.117957494" watchObservedRunningTime="2026-03-10 15:24:34.41788071 +0000 UTC m=+1139.124695448" Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.441453 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" podStartSLOduration=17.8539261 podStartE2EDuration="25.441437338s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="2026-03-10 15:24:25.8961568 +0000 UTC m=+1130.602971548" lastFinishedPulling="2026-03-10 15:24:33.483668038 +0000 UTC m=+1138.190482786" observedRunningTime="2026-03-10 15:24:34.439116941 +0000 UTC m=+1139.145931699" watchObservedRunningTime="2026-03-10 15:24:34.441437338 +0000 UTC m=+1139.148252086" Mar 10 15:24:34 crc kubenswrapper[4743]: I0310 15:24:34.466526 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" podStartSLOduration=3.649427513 podStartE2EDuration="25.466504689s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="2026-03-10 15:24:11.634620662 +0000 UTC m=+1116.341435410" lastFinishedPulling="2026-03-10 15:24:33.451697838 +0000 UTC m=+1138.158512586" observedRunningTime="2026-03-10 15:24:34.463537454 +0000 UTC m=+1139.170352212" watchObservedRunningTime="2026-03-10 15:24:34.466504689 +0000 UTC m=+1139.173319437" Mar 10 15:24:36 crc kubenswrapper[4743]: I0310 15:24:36.752401 4743 scope.go:117] "RemoveContainer" 
containerID="e0c14e8af7b025d9958f803b3c3e43d97c2b88ed0721b93a673f950b69b7fab8" Mar 10 15:24:39 crc kubenswrapper[4743]: I0310 15:24:39.688243 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kn6vc" Mar 10 15:24:39 crc kubenswrapper[4743]: I0310 15:24:39.753411 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wbbgz" Mar 10 15:24:40 crc kubenswrapper[4743]: I0310 15:24:40.018389 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-4652x" Mar 10 15:24:41 crc kubenswrapper[4743]: I0310 15:24:41.777620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:41 crc kubenswrapper[4743]: I0310 15:24:41.790024 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f66984b-7e2a-4644-892d-96c8a0268ab6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-8bnxf\" (UID: \"9f66984b-7e2a-4644-892d-96c8a0268ab6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:41 crc kubenswrapper[4743]: I0310 15:24:41.944442 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-t564p" Mar 10 15:24:41 crc kubenswrapper[4743]: I0310 15:24:41.951556 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:42 crc kubenswrapper[4743]: I0310 15:24:42.466073 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf"] Mar 10 15:24:42 crc kubenswrapper[4743]: W0310 15:24:42.476118 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f66984b_7e2a_4644_892d_96c8a0268ab6.slice/crio-2e1444d999820ac4261f77e048621757a3ab1bb08bd836e3ed8b955f6ec31a8b WatchSource:0}: Error finding container 2e1444d999820ac4261f77e048621757a3ab1bb08bd836e3ed8b955f6ec31a8b: Status 404 returned error can't find the container with id 2e1444d999820ac4261f77e048621757a3ab1bb08bd836e3ed8b955f6ec31a8b Mar 10 15:24:43 crc kubenswrapper[4743]: I0310 15:24:43.458431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" event={"ID":"9f66984b-7e2a-4644-892d-96c8a0268ab6","Type":"ContainerStarted","Data":"d4a873596fe45a0a52e3cc29f23acdec66900abaf519085f8c66701862711547"} Mar 10 15:24:43 crc kubenswrapper[4743]: I0310 15:24:43.459711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" event={"ID":"9f66984b-7e2a-4644-892d-96c8a0268ab6","Type":"ContainerStarted","Data":"2e1444d999820ac4261f77e048621757a3ab1bb08bd836e3ed8b955f6ec31a8b"} Mar 10 15:24:43 crc kubenswrapper[4743]: I0310 15:24:43.459807 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:24:43 crc kubenswrapper[4743]: I0310 15:24:43.488948 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" 
podStartSLOduration=34.488928592 podStartE2EDuration="34.488928592s" podCreationTimestamp="2026-03-10 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:24:43.483052153 +0000 UTC m=+1148.189866911" watchObservedRunningTime="2026-03-10 15:24:43.488928592 +0000 UTC m=+1148.195743340" Mar 10 15:24:45 crc kubenswrapper[4743]: I0310 15:24:45.288377 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-cvxq6" Mar 10 15:24:45 crc kubenswrapper[4743]: I0310 15:24:45.412809 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w" Mar 10 15:24:51 crc kubenswrapper[4743]: I0310 15:24:51.964906 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-8bnxf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.480572 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlbzh"] Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.482962 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.487957 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.488020 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-m55kn" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.496148 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlbzh"] Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.534305 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6bjf"] Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.536714 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.543676 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.587753 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6bjf"] Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.605920 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46959821-7915-4614-ba68-6b0695d3caed-config\") pod \"dnsmasq-dns-675f4bcbfc-qlbzh\" (UID: \"46959821-7915-4614-ba68-6b0695d3caed\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.606055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlrdw\" (UniqueName: \"kubernetes.io/projected/46959821-7915-4614-ba68-6b0695d3caed-kube-api-access-qlrdw\") pod \"dnsmasq-dns-675f4bcbfc-qlbzh\" (UID: \"46959821-7915-4614-ba68-6b0695d3caed\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.707416 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46959821-7915-4614-ba68-6b0695d3caed-config\") pod \"dnsmasq-dns-675f4bcbfc-qlbzh\" (UID: \"46959821-7915-4614-ba68-6b0695d3caed\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.707505 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-config\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.707574 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlrdw\" (UniqueName: \"kubernetes.io/projected/46959821-7915-4614-ba68-6b0695d3caed-kube-api-access-qlrdw\") pod \"dnsmasq-dns-675f4bcbfc-qlbzh\" (UID: \"46959821-7915-4614-ba68-6b0695d3caed\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.707610 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.707669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6stfk\" (UniqueName: \"kubernetes.io/projected/8be5441f-72d5-4509-96de-163dc869ef28-kube-api-access-6stfk\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.708967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46959821-7915-4614-ba68-6b0695d3caed-config\") pod \"dnsmasq-dns-675f4bcbfc-qlbzh\" (UID: \"46959821-7915-4614-ba68-6b0695d3caed\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.737261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlrdw\" (UniqueName: \"kubernetes.io/projected/46959821-7915-4614-ba68-6b0695d3caed-kube-api-access-qlrdw\") pod \"dnsmasq-dns-675f4bcbfc-qlbzh\" (UID: \"46959821-7915-4614-ba68-6b0695d3caed\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.809005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.809058 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6stfk\" (UniqueName: \"kubernetes.io/projected/8be5441f-72d5-4509-96de-163dc869ef28-kube-api-access-6stfk\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.809132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-config\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 
15:25:10.809917 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.810020 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-config\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.815742 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.835065 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6stfk\" (UniqueName: \"kubernetes.io/projected/8be5441f-72d5-4509-96de-163dc869ef28-kube-api-access-6stfk\") pod \"dnsmasq-dns-78dd6ddcc-f6bjf\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:10 crc kubenswrapper[4743]: I0310 15:25:10.858655 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:11 crc kubenswrapper[4743]: I0310 15:25:11.253040 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:25:11 crc kubenswrapper[4743]: I0310 15:25:11.253121 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:25:11 crc kubenswrapper[4743]: I0310 15:25:11.348561 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlbzh"] Mar 10 15:25:11 crc kubenswrapper[4743]: I0310 15:25:11.444416 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6bjf"] Mar 10 15:25:11 crc kubenswrapper[4743]: W0310 15:25:11.471044 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8be5441f_72d5_4509_96de_163dc869ef28.slice/crio-dec0eccbad97f4df4e4dbb7b8f2e638fb4e0d1d143c464d3abfe4092b7384cc8 WatchSource:0}: Error finding container dec0eccbad97f4df4e4dbb7b8f2e638fb4e0d1d143c464d3abfe4092b7384cc8: Status 404 returned error can't find the container with id dec0eccbad97f4df4e4dbb7b8f2e638fb4e0d1d143c464d3abfe4092b7384cc8 Mar 10 15:25:11 crc kubenswrapper[4743]: I0310 15:25:11.721744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" event={"ID":"8be5441f-72d5-4509-96de-163dc869ef28","Type":"ContainerStarted","Data":"dec0eccbad97f4df4e4dbb7b8f2e638fb4e0d1d143c464d3abfe4092b7384cc8"} 
Mar 10 15:25:11 crc kubenswrapper[4743]: I0310 15:25:11.724073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" event={"ID":"46959821-7915-4614-ba68-6b0695d3caed","Type":"ContainerStarted","Data":"b5513f6cd6f7bd97d9a95ec5299869d9c4c0d0e415b9633788edfe95674074ee"} Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.406280 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlbzh"] Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.444788 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmcx2"] Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.447661 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.476746 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmcx2"] Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.565070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-config\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.565149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2d7q\" (UniqueName: \"kubernetes.io/projected/c374aefe-a24a-4804-b91a-7ca787a73b73-kube-api-access-n2d7q\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.565273 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.666998 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2d7q\" (UniqueName: \"kubernetes.io/projected/c374aefe-a24a-4804-b91a-7ca787a73b73-kube-api-access-n2d7q\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.667122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.667149 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-config\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.668085 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-config\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.668599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: 
\"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.722251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2d7q\" (UniqueName: \"kubernetes.io/projected/c374aefe-a24a-4804-b91a-7ca787a73b73-kube-api-access-n2d7q\") pod \"dnsmasq-dns-5ccc8479f9-cmcx2\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") " pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.774438 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.806737 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6bjf"] Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.835573 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mk7qr"] Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.837553 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.858493 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mk7qr"] Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.976313 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.976393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-config\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:13 crc kubenswrapper[4743]: I0310 15:25:13.976461 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkts\" (UniqueName: \"kubernetes.io/projected/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-kube-api-access-qgkts\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.078763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.078848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-config\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.078896 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkts\" (UniqueName: \"kubernetes.io/projected/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-kube-api-access-qgkts\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.080319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.080558 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-config\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.105104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkts\" (UniqueName: \"kubernetes.io/projected/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-kube-api-access-qgkts\") pod \"dnsmasq-dns-57d769cc4f-mk7qr\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") " pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.234174 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.532371 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmcx2"] Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.592340 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mk7qr"] Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.645202 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.646863 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.655830 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.656125 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.656284 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.657239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.659626 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.659958 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.660006 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-bbrgl" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 
15:25:14.665254 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.768170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" event={"ID":"ea447aa6-fdf9-4abb-a9e6-b18253f96a04","Type":"ContainerStarted","Data":"86128c020e89e40e2a39091c7fef927422d867d9520f4784bcaf7942ce6acf86"} Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.770868 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" event={"ID":"c374aefe-a24a-4804-b91a-7ca787a73b73","Type":"ContainerStarted","Data":"e8378b52419f748d68013e59db22367059b5cf1c7aea7ed7570e9ea25ec15b29"} Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.792652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.792713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.792756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.792840 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/884d5267-4e85-481c-96f0-eb31b88bfe67-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.792872 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.792898 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.792930 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.793038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdg2\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-kube-api-access-kvdg2\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.793063 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.793090 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.793112 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/884d5267-4e85-481c-96f0-eb31b88bfe67-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894379 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/884d5267-4e85-481c-96f0-eb31b88bfe67-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894558 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894606 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/884d5267-4e85-481c-96f0-eb31b88bfe67-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894695 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.894805 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdg2\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-kube-api-access-kvdg2\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.896938 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.898394 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.898549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.899265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.899313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.899625 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.905735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc 
kubenswrapper[4743]: I0310 15:25:14.906844 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/884d5267-4e85-481c-96f0-eb31b88bfe67-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.906920 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.927140 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdg2\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-kube-api-access-kvdg2\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.930151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/884d5267-4e85-481c-96f0-eb31b88bfe67-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.946350 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:14 crc kubenswrapper[4743]: I0310 15:25:14.981881 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.012152 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.013646 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.020586 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.020781 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.020913 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.021061 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.021179 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.021309 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.021484 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-blfbw" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.039036 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.104797 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.104950 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.104988 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.105025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.105069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.105093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.105126 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.105149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.105197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.105225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.105244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvxh2\" (UniqueName: 
\"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-kube-api-access-jvxh2\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208253 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvxh2\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-kube-api-access-jvxh2\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208316 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " 
pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208686 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208978 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.208999 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.209020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.209549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.209834 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.210270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.210599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.210608 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.213733 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.222302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.227235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.229468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvxh2\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-kube-api-access-jvxh2\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.231561 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.248590 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.441121 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.472313 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:25:15 crc kubenswrapper[4743]: I0310 15:25:15.833092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"884d5267-4e85-481c-96f0-eb31b88bfe67","Type":"ContainerStarted","Data":"68c784d8b3a23ea6a169548af32af6550eaa28f21ef52396dc1225b0fde88700"} Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.005648 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.115182 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.116693 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.122521 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.123555 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-j6krb" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.123610 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.124529 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.140176 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.145023 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.238160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljvcv\" (UniqueName: \"kubernetes.io/projected/f7917171-7630-46e9-9ada-f7072a5fd530-kube-api-access-ljvcv\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.238242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7917171-7630-46e9-9ada-f7072a5fd530-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.238540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-kolla-config\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.238658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.238833 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7917171-7630-46e9-9ada-f7072a5fd530-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.238893 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-config-data-default\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.238921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7917171-7630-46e9-9ada-f7072a5fd530-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.239046 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7917171-7630-46e9-9ada-f7072a5fd530-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341462 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-config-data-default\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7917171-7630-46e9-9ada-f7072a5fd530-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljvcv\" (UniqueName: \"kubernetes.io/projected/f7917171-7630-46e9-9ada-f7072a5fd530-kube-api-access-ljvcv\") pod \"openstack-galera-0\" (UID: 
\"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7917171-7630-46e9-9ada-f7072a5fd530-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341625 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-kolla-config\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.341937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7917171-7630-46e9-9ada-f7072a5fd530-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.342227 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 
15:25:16.343464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.343638 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-config-data-default\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.343870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7917171-7630-46e9-9ada-f7072a5fd530-kolla-config\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.361451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7917171-7630-46e9-9ada-f7072a5fd530-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.361594 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljvcv\" (UniqueName: \"kubernetes.io/projected/f7917171-7630-46e9-9ada-f7072a5fd530-kube-api-access-ljvcv\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.362035 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7917171-7630-46e9-9ada-f7072a5fd530-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.369669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"f7917171-7630-46e9-9ada-f7072a5fd530\") " pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.453591 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 15:25:16 crc kubenswrapper[4743]: I0310 15:25:16.864781 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7541c4b7-eda5-4cd5-b0c4-c00621726c2b","Type":"ContainerStarted","Data":"328f1248558b679ac72e90fe9427ed4f1003f7a8cbed165974c8d9973c5130e8"} Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.332263 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.334131 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.340065 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mfk9j" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.340706 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.341951 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.342355 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.349784 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.473233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.473317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.473590 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7659278c-be80-4660-a4df-a04d4a3bc888-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.473748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.473855 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7659278c-be80-4660-a4df-a04d4a3bc888-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.473892 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.474134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7659278c-be80-4660-a4df-a04d4a3bc888-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.474168 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j97sp\" 
(UniqueName: \"kubernetes.io/projected/7659278c-be80-4660-a4df-a04d4a3bc888-kube-api-access-j97sp\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.575452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7659278c-be80-4660-a4df-a04d4a3bc888-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.575541 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.575572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7659278c-be80-4660-a4df-a04d4a3bc888-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.575604 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.575690 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7659278c-be80-4660-a4df-a04d4a3bc888-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.575710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j97sp\" (UniqueName: \"kubernetes.io/projected/7659278c-be80-4660-a4df-a04d4a3bc888-kube-api-access-j97sp\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.575764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.575787 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.576877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.578267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.578463 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.594493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7659278c-be80-4660-a4df-a04d4a3bc888-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.596109 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7659278c-be80-4660-a4df-a04d4a3bc888-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.605131 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7659278c-be80-4660-a4df-a04d4a3bc888-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.605843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7659278c-be80-4660-a4df-a04d4a3bc888-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " 
pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.628280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j97sp\" (UniqueName: \"kubernetes.io/projected/7659278c-be80-4660-a4df-a04d4a3bc888-kube-api-access-j97sp\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.644183 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7659278c-be80-4660-a4df-a04d4a3bc888\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.681287 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.745191 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.747187 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.758498 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.758829 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.758976 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-5428v" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.760740 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.894267 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/532a024a-c011-4684-bff0-7d91932d8895-kolla-config\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.894417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/532a024a-c011-4684-bff0-7d91932d8895-memcached-tls-certs\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.894463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/532a024a-c011-4684-bff0-7d91932d8895-config-data\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.894517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8m8ft\" (UniqueName: \"kubernetes.io/projected/532a024a-c011-4684-bff0-7d91932d8895-kube-api-access-8m8ft\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.894562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532a024a-c011-4684-bff0-7d91932d8895-combined-ca-bundle\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.998643 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532a024a-c011-4684-bff0-7d91932d8895-combined-ca-bundle\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:17 crc kubenswrapper[4743]: I0310 15:25:17.999374 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/532a024a-c011-4684-bff0-7d91932d8895-kolla-config\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.000134 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/532a024a-c011-4684-bff0-7d91932d8895-memcached-tls-certs\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.000202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/532a024a-c011-4684-bff0-7d91932d8895-config-data\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " 
pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.000285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m8ft\" (UniqueName: \"kubernetes.io/projected/532a024a-c011-4684-bff0-7d91932d8895-kube-api-access-8m8ft\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.000543 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/532a024a-c011-4684-bff0-7d91932d8895-kolla-config\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.001411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/532a024a-c011-4684-bff0-7d91932d8895-config-data\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.010701 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/532a024a-c011-4684-bff0-7d91932d8895-memcached-tls-certs\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.020532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532a024a-c011-4684-bff0-7d91932d8895-combined-ca-bundle\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.025168 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m8ft\" (UniqueName: 
\"kubernetes.io/projected/532a024a-c011-4684-bff0-7d91932d8895-kube-api-access-8m8ft\") pod \"memcached-0\" (UID: \"532a024a-c011-4684-bff0-7d91932d8895\") " pod="openstack/memcached-0" Mar 10 15:25:18 crc kubenswrapper[4743]: I0310 15:25:18.074550 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 15:25:20 crc kubenswrapper[4743]: I0310 15:25:20.355698 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:25:20 crc kubenswrapper[4743]: I0310 15:25:20.364968 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 15:25:20 crc kubenswrapper[4743]: I0310 15:25:20.368149 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-7rxfn" Mar 10 15:25:20 crc kubenswrapper[4743]: I0310 15:25:20.371784 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:25:20 crc kubenswrapper[4743]: I0310 15:25:20.456402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghbz\" (UniqueName: \"kubernetes.io/projected/26e1af4e-d1d8-4e2f-b5ca-917dfca9906c-kube-api-access-pghbz\") pod \"kube-state-metrics-0\" (UID: \"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c\") " pod="openstack/kube-state-metrics-0" Mar 10 15:25:20 crc kubenswrapper[4743]: I0310 15:25:20.560375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pghbz\" (UniqueName: \"kubernetes.io/projected/26e1af4e-d1d8-4e2f-b5ca-917dfca9906c-kube-api-access-pghbz\") pod \"kube-state-metrics-0\" (UID: \"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c\") " pod="openstack/kube-state-metrics-0" Mar 10 15:25:20 crc kubenswrapper[4743]: I0310 15:25:20.585881 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghbz\" (UniqueName: 
\"kubernetes.io/projected/26e1af4e-d1d8-4e2f-b5ca-917dfca9906c-kube-api-access-pghbz\") pod \"kube-state-metrics-0\" (UID: \"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c\") " pod="openstack/kube-state-metrics-0" Mar 10 15:25:20 crc kubenswrapper[4743]: I0310 15:25:20.736617 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.625703 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x7xkr"] Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.627301 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.630886 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6nt74" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.631143 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.637751 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2xslw"] Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.639932 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.641846 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.653047 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7xkr"] Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.663413 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2xslw"] Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q78hn\" (UniqueName: \"kubernetes.io/projected/bcb12ceb-d1b1-4c62-a718-602c0070e84c-kube-api-access-q78hn\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scb5k\" (UniqueName: \"kubernetes.io/projected/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-kube-api-access-scb5k\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713601 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-ovn-controller-tls-certs\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-lib\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-scripts\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-run\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-combined-ca-bundle\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713799 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-etc-ovs\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713852 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-run-ovn\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713868 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-log-ovn\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713884 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcb12ceb-d1b1-4c62-a718-602c0070e84c-scripts\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713902 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-run\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.713921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-log\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816034 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q78hn\" (UniqueName: 
\"kubernetes.io/projected/bcb12ceb-d1b1-4c62-a718-602c0070e84c-kube-api-access-q78hn\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scb5k\" (UniqueName: \"kubernetes.io/projected/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-kube-api-access-scb5k\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816127 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-ovn-controller-tls-certs\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816159 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-lib\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816215 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-scripts\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-run\") pod \"ovn-controller-ovs-2xslw\" (UID: 
\"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-combined-ca-bundle\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816301 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-etc-ovs\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816333 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-run-ovn\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-log-ovn\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcb12ceb-d1b1-4c62-a718-602c0070e84c-scripts\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816395 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-run\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816417 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-log\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.816945 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-log\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.817055 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-etc-ovs\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.817116 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-lib\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.817289 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bcb12ceb-d1b1-4c62-a718-602c0070e84c-var-run\") pod 
\"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.817346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-run-ovn\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.817369 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-run\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.817616 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-var-log-ovn\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.819396 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcb12ceb-d1b1-4c62-a718-602c0070e84c-scripts\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.819709 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-scripts\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.826483 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-ovn-controller-tls-certs\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.831687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-combined-ca-bundle\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.835266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scb5k\" (UniqueName: \"kubernetes.io/projected/b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64-kube-api-access-scb5k\") pod \"ovn-controller-x7xkr\" (UID: \"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64\") " pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.837448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q78hn\" (UniqueName: \"kubernetes.io/projected/bcb12ceb-d1b1-4c62-a718-602c0070e84c-kube-api-access-q78hn\") pod \"ovn-controller-ovs-2xslw\" (UID: \"bcb12ceb-d1b1-4c62-a718-602c0070e84c\") " pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.949558 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.958083 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.959556 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.959609 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.963987 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.964391 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.964560 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.964420 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.972000 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 15:25:23 crc kubenswrapper[4743]: I0310 15:25:23.987184 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wdz9s" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.019941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.020043 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" 
Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.020076 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.020136 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d73b4e03-5996-41fd-8450-63b8de9e9f2e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.020171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d73b4e03-5996-41fd-8450-63b8de9e9f2e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.020244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28p5x\" (UniqueName: \"kubernetes.io/projected/d73b4e03-5996-41fd-8450-63b8de9e9f2e-kube-api-access-28p5x\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.020327 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73b4e03-5996-41fd-8450-63b8de9e9f2e-config\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.020378 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.121766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.121918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.121967 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d73b4e03-5996-41fd-8450-63b8de9e9f2e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.122010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d73b4e03-5996-41fd-8450-63b8de9e9f2e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.122065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28p5x\" (UniqueName: 
\"kubernetes.io/projected/d73b4e03-5996-41fd-8450-63b8de9e9f2e-kube-api-access-28p5x\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.122130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73b4e03-5996-41fd-8450-63b8de9e9f2e-config\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.122168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.122196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.122806 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.124071 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d73b4e03-5996-41fd-8450-63b8de9e9f2e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 
15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.124464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d73b4e03-5996-41fd-8450-63b8de9e9f2e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.125106 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73b4e03-5996-41fd-8450-63b8de9e9f2e-config\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.127280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.133210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.134909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73b4e03-5996-41fd-8450-63b8de9e9f2e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.145488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.147740 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28p5x\" (UniqueName: \"kubernetes.io/projected/d73b4e03-5996-41fd-8450-63b8de9e9f2e-kube-api-access-28p5x\") pod \"ovsdbserver-nb-0\" (UID: \"d73b4e03-5996-41fd-8450-63b8de9e9f2e\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:24 crc kubenswrapper[4743]: I0310 15:25:24.348113 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.876610 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.879212 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.888209 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.888617 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ljp57" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.888767 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.888943 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.911632 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.974073 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.974128 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.974160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.974392 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-config\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.974499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.974678 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbdv5\" (UniqueName: 
\"kubernetes.io/projected/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-kube-api-access-dbdv5\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.974853 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:26 crc kubenswrapper[4743]: I0310 15:25:26.974913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.076609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.076676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.076712 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.076743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-config\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.076764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.076862 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.076873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbdv5\" (UniqueName: \"kubernetes.io/projected/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-kube-api-access-dbdv5\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.076926 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 
15:25:27.076956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.077409 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.077978 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.078027 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-config\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.083607 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.083716 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-metrics-certs-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.084578 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.096262 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbdv5\" (UniqueName: \"kubernetes.io/projected/688394f8-79dd-4922-9a2c-fbd9ffbb3a28-kube-api-access-dbdv5\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.105061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"688394f8-79dd-4922-9a2c-fbd9ffbb3a28\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:27 crc kubenswrapper[4743]: I0310 15:25:27.219039 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.511531 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.512235 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n2d7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:
&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-cmcx2_openstack(c374aefe-a24a-4804-b91a-7ca787a73b73): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.513907 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" podUID="c374aefe-a24a-4804-b91a-7ca787a73b73" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.536263 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.536454 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6stfk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-f6bjf_openstack(8be5441f-72d5-4509-96de-163dc869ef28): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.536696 4743 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.536781 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgkts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesyste
m:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-mk7qr_openstack(ea447aa6-fdf9-4abb-a9e6-b18253f96a04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.538122 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" podUID="8be5441f-72d5-4509-96de-163dc869ef28" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.538708 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" podUID="ea447aa6-fdf9-4abb-a9e6-b18253f96a04" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.561153 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.561313 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qlrdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-qlbzh_openstack(46959821-7915-4614-ba68-6b0695d3caed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:25:33 crc kubenswrapper[4743]: E0310 15:25:33.562833 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" podUID="46959821-7915-4614-ba68-6b0695d3caed" Mar 10 15:25:34 crc kubenswrapper[4743]: E0310 15:25:34.020925 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" podUID="c374aefe-a24a-4804-b91a-7ca787a73b73" Mar 10 15:25:34 crc kubenswrapper[4743]: E0310 15:25:34.021035 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" podUID="ea447aa6-fdf9-4abb-a9e6-b18253f96a04" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.225066 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.239287 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.271232 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.547764 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2xslw"] Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.554375 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.565144 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7xkr"] Mar 10 15:25:34 crc kubenswrapper[4743]: W0310 15:25:34.566985 4743 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcb12ceb_d1b1_4c62_a718_602c0070e84c.slice/crio-7ed692a041dda3140545e42c89e643518ed606551fb69286b6ab7c90cac6a446 WatchSource:0}: Error finding container 7ed692a041dda3140545e42c89e643518ed606551fb69286b6ab7c90cac6a446: Status 404 returned error can't find the container with id 7ed692a041dda3140545e42c89e643518ed606551fb69286b6ab7c90cac6a446 Mar 10 15:25:34 crc kubenswrapper[4743]: W0310 15:25:34.569053 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7917171_7630_46e9_9ada_f7072a5fd530.slice/crio-9dfc44da6c07bad8a39084c62d0187ff0589bb77f7b7758e1afa82d9c8adca0d WatchSource:0}: Error finding container 9dfc44da6c07bad8a39084c62d0187ff0589bb77f7b7758e1afa82d9c8adca0d: Status 404 returned error can't find the container with id 9dfc44da6c07bad8a39084c62d0187ff0589bb77f7b7758e1afa82d9c8adca0d Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.657276 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.728439 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.735766 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.828967 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6stfk\" (UniqueName: \"kubernetes.io/projected/8be5441f-72d5-4509-96de-163dc869ef28-kube-api-access-6stfk\") pod \"8be5441f-72d5-4509-96de-163dc869ef28\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.829976 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-config\") pod \"8be5441f-72d5-4509-96de-163dc869ef28\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.830054 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-dns-svc\") pod \"8be5441f-72d5-4509-96de-163dc869ef28\" (UID: \"8be5441f-72d5-4509-96de-163dc869ef28\") " Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.830728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-config" (OuterVolumeSpecName: "config") pod "8be5441f-72d5-4509-96de-163dc869ef28" (UID: "8be5441f-72d5-4509-96de-163dc869ef28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.830769 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8be5441f-72d5-4509-96de-163dc869ef28" (UID: "8be5441f-72d5-4509-96de-163dc869ef28"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.846270 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be5441f-72d5-4509-96de-163dc869ef28-kube-api-access-6stfk" (OuterVolumeSpecName: "kube-api-access-6stfk") pod "8be5441f-72d5-4509-96de-163dc869ef28" (UID: "8be5441f-72d5-4509-96de-163dc869ef28"). InnerVolumeSpecName "kube-api-access-6stfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.931717 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46959821-7915-4614-ba68-6b0695d3caed-config\") pod \"46959821-7915-4614-ba68-6b0695d3caed\" (UID: \"46959821-7915-4614-ba68-6b0695d3caed\") " Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.931769 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlrdw\" (UniqueName: \"kubernetes.io/projected/46959821-7915-4614-ba68-6b0695d3caed-kube-api-access-qlrdw\") pod \"46959821-7915-4614-ba68-6b0695d3caed\" (UID: \"46959821-7915-4614-ba68-6b0695d3caed\") " Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.932109 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.932137 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be5441f-72d5-4509-96de-163dc869ef28-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.932148 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6stfk\" (UniqueName: \"kubernetes.io/projected/8be5441f-72d5-4509-96de-163dc869ef28-kube-api-access-6stfk\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.932265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46959821-7915-4614-ba68-6b0695d3caed-config" (OuterVolumeSpecName: "config") pod "46959821-7915-4614-ba68-6b0695d3caed" (UID: "46959821-7915-4614-ba68-6b0695d3caed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:34 crc kubenswrapper[4743]: I0310 15:25:34.966432 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46959821-7915-4614-ba68-6b0695d3caed-kube-api-access-qlrdw" (OuterVolumeSpecName: "kube-api-access-qlrdw") pod "46959821-7915-4614-ba68-6b0695d3caed" (UID: "46959821-7915-4614-ba68-6b0695d3caed"). InnerVolumeSpecName "kube-api-access-qlrdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.026859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7659278c-be80-4660-a4df-a04d4a3bc888","Type":"ContainerStarted","Data":"ef63db455817abc4c3c961fbee1f95cd0f3942e5970b15bae631510cf7439e5f"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.028760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"532a024a-c011-4684-bff0-7d91932d8895","Type":"ContainerStarted","Data":"698907d8264689cdd14073fa1e7abc304ec7d6ca533fa02a2183cce15e0aebb9"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.029920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"688394f8-79dd-4922-9a2c-fbd9ffbb3a28","Type":"ContainerStarted","Data":"c4dd83799d597dc84e5f0912a00190913288d1f026dc82be6d52ca78cc13bd23"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.031033 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" 
event={"ID":"8be5441f-72d5-4509-96de-163dc869ef28","Type":"ContainerDied","Data":"dec0eccbad97f4df4e4dbb7b8f2e638fb4e0d1d143c464d3abfe4092b7384cc8"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.031099 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-f6bjf" Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.033264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f7917171-7630-46e9-9ada-f7072a5fd530","Type":"ContainerStarted","Data":"9dfc44da6c07bad8a39084c62d0187ff0589bb77f7b7758e1afa82d9c8adca0d"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.034672 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c","Type":"ContainerStarted","Data":"c9578c10055819a51ad2192fbdb2aa312c1e09a37fa8084681cb02dd770ae8a5"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.034953 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlrdw\" (UniqueName: \"kubernetes.io/projected/46959821-7915-4614-ba68-6b0695d3caed-kube-api-access-qlrdw\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.034991 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46959821-7915-4614-ba68-6b0695d3caed-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.037367 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" event={"ID":"46959821-7915-4614-ba68-6b0695d3caed","Type":"ContainerDied","Data":"b5513f6cd6f7bd97d9a95ec5299869d9c4c0d0e415b9633788edfe95674074ee"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.037475 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qlbzh" Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.038736 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7xkr" event={"ID":"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64","Type":"ContainerStarted","Data":"c51e0026bd280023bff1e9cdee7d698948760d1a4c7b37048e6d0ad3317ce320"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.040125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2xslw" event={"ID":"bcb12ceb-d1b1-4c62-a718-602c0070e84c","Type":"ContainerStarted","Data":"7ed692a041dda3140545e42c89e643518ed606551fb69286b6ab7c90cac6a446"} Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.188709 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6bjf"] Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.197308 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6bjf"] Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.214069 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlbzh"] Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.220962 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlbzh"] Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.616480 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.930749 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46959821-7915-4614-ba68-6b0695d3caed" path="/var/lib/kubelet/pods/46959821-7915-4614-ba68-6b0695d3caed/volumes" Mar 10 15:25:35 crc kubenswrapper[4743]: I0310 15:25:35.931198 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be5441f-72d5-4509-96de-163dc869ef28" path="/var/lib/kubelet/pods/8be5441f-72d5-4509-96de-163dc869ef28/volumes" Mar 
10 15:25:36 crc kubenswrapper[4743]: I0310 15:25:36.053014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7541c4b7-eda5-4cd5-b0c4-c00621726c2b","Type":"ContainerStarted","Data":"290b8969232645596b41011b8e17a74e528a28bee07b274203a8b937c2e4b355"} Mar 10 15:25:36 crc kubenswrapper[4743]: I0310 15:25:36.056608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"884d5267-4e85-481c-96f0-eb31b88bfe67","Type":"ContainerStarted","Data":"642a6c5de3968f103baa5bbf0937b99b8102f35959c7bfcd09d2ddb3d9449e34"} Mar 10 15:25:36 crc kubenswrapper[4743]: W0310 15:25:36.898054 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd73b4e03_5996_41fd_8450_63b8de9e9f2e.slice/crio-fc3c0283fe35f80bd50d52bb54b314b583e4217e1b0627cdf1765747a5d67e6b WatchSource:0}: Error finding container fc3c0283fe35f80bd50d52bb54b314b583e4217e1b0627cdf1765747a5d67e6b: Status 404 returned error can't find the container with id fc3c0283fe35f80bd50d52bb54b314b583e4217e1b0627cdf1765747a5d67e6b Mar 10 15:25:37 crc kubenswrapper[4743]: I0310 15:25:37.068858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d73b4e03-5996-41fd-8450-63b8de9e9f2e","Type":"ContainerStarted","Data":"fc3c0283fe35f80bd50d52bb54b314b583e4217e1b0627cdf1765747a5d67e6b"} Mar 10 15:25:41 crc kubenswrapper[4743]: I0310 15:25:41.252847 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:25:41 crc kubenswrapper[4743]: I0310 15:25:41.253445 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:25:42 crc kubenswrapper[4743]: I0310 15:25:42.106695 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7xkr" event={"ID":"b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64","Type":"ContainerStarted","Data":"1821d82811dacfb1d3f1f3120e90e0da09faf8260caa899ab7943b92057cf22e"} Mar 10 15:25:42 crc kubenswrapper[4743]: I0310 15:25:42.108616 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-x7xkr" Mar 10 15:25:42 crc kubenswrapper[4743]: I0310 15:25:42.133459 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-x7xkr" podStartSLOduration=12.587310233 podStartE2EDuration="19.133437445s" podCreationTimestamp="2026-03-10 15:25:23 +0000 UTC" firstStartedPulling="2026-03-10 15:25:34.578485175 +0000 UTC m=+1199.285299923" lastFinishedPulling="2026-03-10 15:25:41.124612397 +0000 UTC m=+1205.831427135" observedRunningTime="2026-03-10 15:25:42.12526215 +0000 UTC m=+1206.832076898" watchObservedRunningTime="2026-03-10 15:25:42.133437445 +0000 UTC m=+1206.840252193" Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.122133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c","Type":"ContainerStarted","Data":"c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd"} Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.122489 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.124884 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"688394f8-79dd-4922-9a2c-fbd9ffbb3a28","Type":"ContainerStarted","Data":"cfee45ce184f064dc34d4345364ee5ebad14c7d7708696e7965268fd7ac19bdd"} Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.127173 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f7917171-7630-46e9-9ada-f7072a5fd530","Type":"ContainerStarted","Data":"a1c77f29631f2dc36a40a8083aa9343762bbfaa6002d9a78a49ab9e4d3d4949b"} Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.131337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7659278c-be80-4660-a4df-a04d4a3bc888","Type":"ContainerStarted","Data":"82ca1a675a533aa6e44372586722436d356906f4e758a2e2e0219cc0ccce2799"} Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.135969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d73b4e03-5996-41fd-8450-63b8de9e9f2e","Type":"ContainerStarted","Data":"89c3413e9a322395f82ef7cac6f9485c215e229c8b62d2e18ec659d3f40197d8"} Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.147628 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.364483484 podStartE2EDuration="23.147608528s" podCreationTimestamp="2026-03-10 15:25:20 +0000 UTC" firstStartedPulling="2026-03-10 15:25:34.239090302 +0000 UTC m=+1198.945905060" lastFinishedPulling="2026-03-10 15:25:42.022215356 +0000 UTC m=+1206.729030104" observedRunningTime="2026-03-10 15:25:43.13795709 +0000 UTC m=+1207.844771858" watchObservedRunningTime="2026-03-10 15:25:43.147608528 +0000 UTC m=+1207.854423296" Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.149372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2xslw" 
event={"ID":"bcb12ceb-d1b1-4c62-a718-602c0070e84c","Type":"ContainerStarted","Data":"0604b748a617d20a210d79a438eb882a19a2dd54b44c497744536bf59065b93c"} Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.154740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"532a024a-c011-4684-bff0-7d91932d8895","Type":"ContainerStarted","Data":"d27ea3aeba485a4b58aad977468e6635b9cf82a526596ccd3c600ec81ed084f1"} Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.155005 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 15:25:43 crc kubenswrapper[4743]: I0310 15:25:43.248839 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.047894417 podStartE2EDuration="26.248803679s" podCreationTimestamp="2026-03-10 15:25:17 +0000 UTC" firstStartedPulling="2026-03-10 15:25:34.263053902 +0000 UTC m=+1198.969868650" lastFinishedPulling="2026-03-10 15:25:40.463963164 +0000 UTC m=+1205.170777912" observedRunningTime="2026-03-10 15:25:43.247267485 +0000 UTC m=+1207.954082233" watchObservedRunningTime="2026-03-10 15:25:43.248803679 +0000 UTC m=+1207.955618447" Mar 10 15:25:44 crc kubenswrapper[4743]: I0310 15:25:44.165992 4743 generic.go:334] "Generic (PLEG): container finished" podID="bcb12ceb-d1b1-4c62-a718-602c0070e84c" containerID="0604b748a617d20a210d79a438eb882a19a2dd54b44c497744536bf59065b93c" exitCode=0 Mar 10 15:25:44 crc kubenswrapper[4743]: I0310 15:25:44.166162 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2xslw" event={"ID":"bcb12ceb-d1b1-4c62-a718-602c0070e84c","Type":"ContainerDied","Data":"0604b748a617d20a210d79a438eb882a19a2dd54b44c497744536bf59065b93c"} Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.186879 4743 generic.go:334] "Generic (PLEG): container finished" podID="f7917171-7630-46e9-9ada-f7072a5fd530" 
containerID="a1c77f29631f2dc36a40a8083aa9343762bbfaa6002d9a78a49ab9e4d3d4949b" exitCode=0
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.186955 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f7917171-7630-46e9-9ada-f7072a5fd530","Type":"ContainerDied","Data":"a1c77f29631f2dc36a40a8083aa9343762bbfaa6002d9a78a49ab9e4d3d4949b"}
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.195154 4743 generic.go:334] "Generic (PLEG): container finished" podID="7659278c-be80-4660-a4df-a04d4a3bc888" containerID="82ca1a675a533aa6e44372586722436d356906f4e758a2e2e0219cc0ccce2799" exitCode=0
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.195267 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7659278c-be80-4660-a4df-a04d4a3bc888","Type":"ContainerDied","Data":"82ca1a675a533aa6e44372586722436d356906f4e758a2e2e0219cc0ccce2799"}
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.198314 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d73b4e03-5996-41fd-8450-63b8de9e9f2e","Type":"ContainerStarted","Data":"596b6aceaca59fbf1b752d8d25f648b73955ea9230cf929eea0c94e8c6e74011"}
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.201249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2xslw" event={"ID":"bcb12ceb-d1b1-4c62-a718-602c0070e84c","Type":"ContainerStarted","Data":"a1a668444e28f798e7bb94b36c6684f0feda353ed3dfa0d4f56ebb39176e9a43"}
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.205485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"688394f8-79dd-4922-9a2c-fbd9ffbb3a28","Type":"ContainerStarted","Data":"12632dcb86532391743686e6c4075e7c19bdd0d35813211f8d7f460c54289578"}
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.259265 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.249149395 podStartE2EDuration="21.2592194s" podCreationTimestamp="2026-03-10 15:25:25 +0000 UTC" firstStartedPulling="2026-03-10 15:25:34.659787093 +0000 UTC m=+1199.366601841" lastFinishedPulling="2026-03-10 15:25:45.669857098 +0000 UTC m=+1210.376671846" observedRunningTime="2026-03-10 15:25:46.257432149 +0000 UTC m=+1210.964246897" watchObservedRunningTime="2026-03-10 15:25:46.2592194 +0000 UTC m=+1210.966034138"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.344803 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.560511189 podStartE2EDuration="24.344775371s" podCreationTimestamp="2026-03-10 15:25:22 +0000 UTC" firstStartedPulling="2026-03-10 15:25:36.901247786 +0000 UTC m=+1201.608062534" lastFinishedPulling="2026-03-10 15:25:45.685511938 +0000 UTC m=+1210.392326716" observedRunningTime="2026-03-10 15:25:46.334178206 +0000 UTC m=+1211.040992954" watchObservedRunningTime="2026-03-10 15:25:46.344775371 +0000 UTC m=+1211.051590119"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.919005 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-djcxl"]
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.920882 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.926518 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.944647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-djcxl"]
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.973145 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f838c354-e379-48b7-8c2e-e295ec8c135b-combined-ca-bundle\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.973296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f838c354-e379-48b7-8c2e-e295ec8c135b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.973417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftsbg\" (UniqueName: \"kubernetes.io/projected/f838c354-e379-48b7-8c2e-e295ec8c135b-kube-api-access-ftsbg\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.973480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f838c354-e379-48b7-8c2e-e295ec8c135b-config\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.973604 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f838c354-e379-48b7-8c2e-e295ec8c135b-ovn-rundir\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:46 crc kubenswrapper[4743]: I0310 15:25:46.973736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f838c354-e379-48b7-8c2e-e295ec8c135b-ovs-rundir\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.075094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f838c354-e379-48b7-8c2e-e295ec8c135b-combined-ca-bundle\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.075417 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f838c354-e379-48b7-8c2e-e295ec8c135b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.075533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftsbg\" (UniqueName: \"kubernetes.io/projected/f838c354-e379-48b7-8c2e-e295ec8c135b-kube-api-access-ftsbg\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.075679 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f838c354-e379-48b7-8c2e-e295ec8c135b-config\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.075827 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f838c354-e379-48b7-8c2e-e295ec8c135b-ovn-rundir\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.075931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f838c354-e379-48b7-8c2e-e295ec8c135b-ovs-rundir\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.076344 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f838c354-e379-48b7-8c2e-e295ec8c135b-ovs-rundir\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.076566 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f838c354-e379-48b7-8c2e-e295ec8c135b-ovn-rundir\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.077179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f838c354-e379-48b7-8c2e-e295ec8c135b-config\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.086608 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f838c354-e379-48b7-8c2e-e295ec8c135b-combined-ca-bundle\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.095092 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f838c354-e379-48b7-8c2e-e295ec8c135b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.097339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftsbg\" (UniqueName: \"kubernetes.io/projected/f838c354-e379-48b7-8c2e-e295ec8c135b-kube-api-access-ftsbg\") pod \"ovn-controller-metrics-djcxl\" (UID: \"f838c354-e379-48b7-8c2e-e295ec8c135b\") " pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.119084 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mk7qr"]
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.164457 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-7bklp"]
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.166117 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.177008 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.185336 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-7bklp"]
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.219916 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.222557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2xslw" event={"ID":"bcb12ceb-d1b1-4c62-a718-602c0070e84c","Type":"ContainerStarted","Data":"35dabc6238f56dbeddac90749ca6d4feafc60da92fe1202b0c862c698e92c656"}
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.223656 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2xslw"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.223687 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2xslw"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.229212 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f7917171-7630-46e9-9ada-f7072a5fd530","Type":"ContainerStarted","Data":"1b8431164d6d23246f87546a0b151dbb69336ec56d7cbca6051f77e7d336af02"}
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.234788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7659278c-be80-4660-a4df-a04d4a3bc888","Type":"ContainerStarted","Data":"c08262d9c36c3be4490566dec8f644c1210340092735ed2c33716d668e823cae"}
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.249064 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-djcxl"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.253080 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2xslw" podStartSLOduration=17.534069004 podStartE2EDuration="24.253056818s" podCreationTimestamp="2026-03-10 15:25:23 +0000 UTC" firstStartedPulling="2026-03-10 15:25:34.570621319 +0000 UTC m=+1199.277436067" lastFinishedPulling="2026-03-10 15:25:41.289609133 +0000 UTC m=+1205.996423881" observedRunningTime="2026-03-10 15:25:47.24513166 +0000 UTC m=+1211.951946408" watchObservedRunningTime="2026-03-10 15:25:47.253056818 +0000 UTC m=+1211.959871566"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.280356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.280431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dl76\" (UniqueName: \"kubernetes.io/projected/0ac45631-2e26-4153-b619-9608c84a4580-kube-api-access-2dl76\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.280478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.280524 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-config\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.292506 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.243851565 podStartE2EDuration="31.292476892s" podCreationTimestamp="2026-03-10 15:25:16 +0000 UTC" firstStartedPulling="2026-03-10 15:25:34.241275195 +0000 UTC m=+1198.948089943" lastFinishedPulling="2026-03-10 15:25:41.289900532 +0000 UTC m=+1205.996715270" observedRunningTime="2026-03-10 15:25:47.284314637 +0000 UTC m=+1211.991129385" watchObservedRunningTime="2026-03-10 15:25:47.292476892 +0000 UTC m=+1211.999291640"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.321852 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.678172538 podStartE2EDuration="32.321824566s" podCreationTimestamp="2026-03-10 15:25:15 +0000 UTC" firstStartedPulling="2026-03-10 15:25:34.572191864 +0000 UTC m=+1199.279006612" lastFinishedPulling="2026-03-10 15:25:41.215843892 +0000 UTC m=+1205.922658640" observedRunningTime="2026-03-10 15:25:47.317016418 +0000 UTC m=+1212.023831166" watchObservedRunningTime="2026-03-10 15:25:47.321824566 +0000 UTC m=+1212.028639314"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.364133 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmcx2"]
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.381998 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.387396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dl76\" (UniqueName: \"kubernetes.io/projected/0ac45631-2e26-4153-b619-9608c84a4580-kube-api-access-2dl76\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.387612 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.387783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-config\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.383298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.388410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.388851 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-config\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.395749 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-z6kh5"]
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.397144 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.406701 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.422180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dl76\" (UniqueName: \"kubernetes.io/projected/0ac45631-2e26-4153-b619-9608c84a4580-kube-api-access-2dl76\") pod \"dnsmasq-dns-6bc7876d45-7bklp\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.443841 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z6kh5"]
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.489268 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.489453 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.489564 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-config\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.489927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-dns-svc\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.489954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvvn\" (UniqueName: \"kubernetes.io/projected/f7975923-a8ea-4269-b285-6fa5a4d639ad-kube-api-access-2wvvn\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.493319 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.593772 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.593851 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.593885 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-config\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.593965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-dns-svc\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.593987 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvvn\" (UniqueName: \"kubernetes.io/projected/f7975923-a8ea-4269-b285-6fa5a4d639ad-kube-api-access-2wvvn\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.594835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.594912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-config\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.595274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.595593 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-dns-svc\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.610481 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvvn\" (UniqueName: \"kubernetes.io/projected/f7975923-a8ea-4269-b285-6fa5a4d639ad-kube-api-access-2wvvn\") pod \"dnsmasq-dns-8554648995-z6kh5\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") " pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.682241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.682278 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.726784 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.825256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-djcxl"]
Mar 10 15:25:47 crc kubenswrapper[4743]: I0310 15:25:47.949231 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-7bklp"]
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.077151 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 10 15:25:48 crc kubenswrapper[4743]: W0310 15:25:48.092011 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ac45631_2e26_4153_b619_9608c84a4580.slice/crio-46963c51b50ebb237e5661592e51c3079fd276d3638bf36647119e92f4841fb0 WatchSource:0}: Error finding container 46963c51b50ebb237e5661592e51c3079fd276d3638bf36647119e92f4841fb0: Status 404 returned error can't find the container with id 46963c51b50ebb237e5661592e51c3079fd276d3638bf36647119e92f4841fb0
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.198773 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z6kh5"]
Mar 10 15:25:48 crc kubenswrapper[4743]: W0310 15:25:48.212882 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7975923_a8ea_4269_b285_6fa5a4d639ad.slice/crio-635b6cd8676edbf55b6db584d99304b8d6a45b516a039971fdd760f8f0a61c2b WatchSource:0}: Error finding container 635b6cd8676edbf55b6db584d99304b8d6a45b516a039971fdd760f8f0a61c2b: Status 404 returned error can't find the container with id 635b6cd8676edbf55b6db584d99304b8d6a45b516a039971fdd760f8f0a61c2b
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.221207 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.250662 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" event={"ID":"0ac45631-2e26-4153-b619-9608c84a4580","Type":"ContainerStarted","Data":"46963c51b50ebb237e5661592e51c3079fd276d3638bf36647119e92f4841fb0"}
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.252438 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z6kh5" event={"ID":"f7975923-a8ea-4269-b285-6fa5a4d639ad","Type":"ContainerStarted","Data":"635b6cd8676edbf55b6db584d99304b8d6a45b516a039971fdd760f8f0a61c2b"}
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.255355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-djcxl" event={"ID":"f838c354-e379-48b7-8c2e-e295ec8c135b","Type":"ContainerStarted","Data":"fcacbe0eff2b363aaba6d60491d93afb284c45c6408ee75228207450069d3e87"}
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.255382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-djcxl" event={"ID":"f838c354-e379-48b7-8c2e-e295ec8c135b","Type":"ContainerStarted","Data":"106caaf5847d80191ed61f6d9e30fe194e5a70a8982fab15b1b66474579a2db1"}
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.259869 4743 generic.go:334] "Generic (PLEG): container finished" podID="ea447aa6-fdf9-4abb-a9e6-b18253f96a04" containerID="92353b0f2cc341be26f2152847d276612cd973eaf606ff415cf32915769f5741" exitCode=0
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.259918 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" event={"ID":"ea447aa6-fdf9-4abb-a9e6-b18253f96a04","Type":"ContainerDied","Data":"92353b0f2cc341be26f2152847d276612cd973eaf606ff415cf32915769f5741"}
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.263202 4743 generic.go:334] "Generic (PLEG): container finished" podID="c374aefe-a24a-4804-b91a-7ca787a73b73" containerID="2e37933441aa988a4b8655d9b6e06362d72d7ce6fe8b0ca2d9661859b1a5016c" exitCode=0
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.263363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" event={"ID":"c374aefe-a24a-4804-b91a-7ca787a73b73","Type":"ContainerDied","Data":"2e37933441aa988a4b8655d9b6e06362d72d7ce6fe8b0ca2d9661859b1a5016c"}
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.276971 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-djcxl" podStartSLOduration=2.276928778 podStartE2EDuration="2.276928778s" podCreationTimestamp="2026-03-10 15:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:25:48.275386464 +0000 UTC m=+1212.982201212" watchObservedRunningTime="2026-03-10 15:25:48.276928778 +0000 UTC m=+1212.983743516"
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.342418 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.348341 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.431390 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.737637 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr"
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.742892 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2"
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.834574 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2d7q\" (UniqueName: \"kubernetes.io/projected/c374aefe-a24a-4804-b91a-7ca787a73b73-kube-api-access-n2d7q\") pod \"c374aefe-a24a-4804-b91a-7ca787a73b73\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") "
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.834698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-config\") pod \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") "
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.834771 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-config\") pod \"c374aefe-a24a-4804-b91a-7ca787a73b73\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") "
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.834934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-dns-svc\") pod \"c374aefe-a24a-4804-b91a-7ca787a73b73\" (UID: \"c374aefe-a24a-4804-b91a-7ca787a73b73\") "
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.834983 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-dns-svc\") pod \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") "
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.835087 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgkts\" (UniqueName: \"kubernetes.io/projected/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-kube-api-access-qgkts\") pod \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\" (UID: \"ea447aa6-fdf9-4abb-a9e6-b18253f96a04\") "
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.843395 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c374aefe-a24a-4804-b91a-7ca787a73b73-kube-api-access-n2d7q" (OuterVolumeSpecName: "kube-api-access-n2d7q") pod "c374aefe-a24a-4804-b91a-7ca787a73b73" (UID: "c374aefe-a24a-4804-b91a-7ca787a73b73"). InnerVolumeSpecName "kube-api-access-n2d7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.859102 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-kube-api-access-qgkts" (OuterVolumeSpecName: "kube-api-access-qgkts") pod "ea447aa6-fdf9-4abb-a9e6-b18253f96a04" (UID: "ea447aa6-fdf9-4abb-a9e6-b18253f96a04"). InnerVolumeSpecName "kube-api-access-qgkts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.861114 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-config" (OuterVolumeSpecName: "config") pod "ea447aa6-fdf9-4abb-a9e6-b18253f96a04" (UID: "ea447aa6-fdf9-4abb-a9e6-b18253f96a04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.863377 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea447aa6-fdf9-4abb-a9e6-b18253f96a04" (UID: "ea447aa6-fdf9-4abb-a9e6-b18253f96a04"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.869609 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-config" (OuterVolumeSpecName: "config") pod "c374aefe-a24a-4804-b91a-7ca787a73b73" (UID: "c374aefe-a24a-4804-b91a-7ca787a73b73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.890791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c374aefe-a24a-4804-b91a-7ca787a73b73" (UID: "c374aefe-a24a-4804-b91a-7ca787a73b73"). InnerVolumeSpecName "dns-svc".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.937195 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.937594 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgkts\" (UniqueName: \"kubernetes.io/projected/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-kube-api-access-qgkts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.937611 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2d7q\" (UniqueName: \"kubernetes.io/projected/c374aefe-a24a-4804-b91a-7ca787a73b73-kube-api-access-n2d7q\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.937620 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea447aa6-fdf9-4abb-a9e6-b18253f96a04-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.937633 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:48 crc kubenswrapper[4743]: I0310 15:25:48.937644 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c374aefe-a24a-4804-b91a-7ca787a73b73-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.272832 4743 generic.go:334] "Generic (PLEG): container finished" podID="f7975923-a8ea-4269-b285-6fa5a4d639ad" containerID="d0134f01d6d634c3742660c7f3f08906becbb9d03ab4b52225a868c555cb9d58" exitCode=0 Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.272883 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-8554648995-z6kh5" event={"ID":"f7975923-a8ea-4269-b285-6fa5a4d639ad","Type":"ContainerDied","Data":"d0134f01d6d634c3742660c7f3f08906becbb9d03ab4b52225a868c555cb9d58"} Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.278043 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.278071 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mk7qr" event={"ID":"ea447aa6-fdf9-4abb-a9e6-b18253f96a04","Type":"ContainerDied","Data":"86128c020e89e40e2a39091c7fef927422d867d9520f4784bcaf7942ce6acf86"} Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.278255 4743 scope.go:117] "RemoveContainer" containerID="92353b0f2cc341be26f2152847d276612cd973eaf606ff415cf32915769f5741" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.280418 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" event={"ID":"c374aefe-a24a-4804-b91a-7ca787a73b73","Type":"ContainerDied","Data":"e8378b52419f748d68013e59db22367059b5cf1c7aea7ed7570e9ea25ec15b29"} Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.280518 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-cmcx2" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.283354 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ac45631-2e26-4153-b619-9608c84a4580" containerID="399f54163fabd1ff5332b7389886435af00c10caf05730e95e181fd89f931bf0" exitCode=0 Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.283567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" event={"ID":"0ac45631-2e26-4153-b619-9608c84a4580","Type":"ContainerDied","Data":"399f54163fabd1ff5332b7389886435af00c10caf05730e95e181fd89f931bf0"} Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.284875 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.303629 4743 scope.go:117] "RemoveContainer" containerID="2e37933441aa988a4b8655d9b6e06362d72d7ce6fe8b0ca2d9661859b1a5016c" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.344596 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.348727 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.404464 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmcx2"] Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.423159 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-cmcx2"] Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.466936 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mk7qr"] Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.472468 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mk7qr"] Mar 10 15:25:49 crc 
kubenswrapper[4743]: I0310 15:25:49.745388 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 15:25:49 crc kubenswrapper[4743]: E0310 15:25:49.746287 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c374aefe-a24a-4804-b91a-7ca787a73b73" containerName="init" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.746320 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c374aefe-a24a-4804-b91a-7ca787a73b73" containerName="init" Mar 10 15:25:49 crc kubenswrapper[4743]: E0310 15:25:49.746358 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea447aa6-fdf9-4abb-a9e6-b18253f96a04" containerName="init" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.746373 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea447aa6-fdf9-4abb-a9e6-b18253f96a04" containerName="init" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.746628 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c374aefe-a24a-4804-b91a-7ca787a73b73" containerName="init" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.746657 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea447aa6-fdf9-4abb-a9e6-b18253f96a04" containerName="init" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.749736 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.753840 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qtxtw" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.754097 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.754143 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.763830 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.772528 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.859400 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3431d016-101b-4513-8027-33805ae14fce-config\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.859468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3431d016-101b-4513-8027-33805ae14fce-scripts\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.859519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3431d016-101b-4513-8027-33805ae14fce-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc 
kubenswrapper[4743]: I0310 15:25:49.859578 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.859611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.859643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xbd\" (UniqueName: \"kubernetes.io/projected/3431d016-101b-4513-8027-33805ae14fce-kube-api-access-n9xbd\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.859673 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.928032 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c374aefe-a24a-4804-b91a-7ca787a73b73" path="/var/lib/kubelet/pods/c374aefe-a24a-4804-b91a-7ca787a73b73/volumes" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.928704 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea447aa6-fdf9-4abb-a9e6-b18253f96a04" 
path="/var/lib/kubelet/pods/ea447aa6-fdf9-4abb-a9e6-b18253f96a04/volumes" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.961350 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3431d016-101b-4513-8027-33805ae14fce-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.961996 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.962029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.962054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xbd\" (UniqueName: \"kubernetes.io/projected/3431d016-101b-4513-8027-33805ae14fce-kube-api-access-n9xbd\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.962075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.962401 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3431d016-101b-4513-8027-33805ae14fce-config\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.962474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3431d016-101b-4513-8027-33805ae14fce-scripts\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.961902 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3431d016-101b-4513-8027-33805ae14fce-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.963966 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3431d016-101b-4513-8027-33805ae14fce-config\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.964067 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3431d016-101b-4513-8027-33805ae14fce-scripts\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.969186 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 
15:25:49.971726 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.971739 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3431d016-101b-4513-8027-33805ae14fce-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:49 crc kubenswrapper[4743]: I0310 15:25:49.993967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xbd\" (UniqueName: \"kubernetes.io/projected/3431d016-101b-4513-8027-33805ae14fce-kube-api-access-n9xbd\") pod \"ovn-northd-0\" (UID: \"3431d016-101b-4513-8027-33805ae14fce\") " pod="openstack/ovn-northd-0" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.081538 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.297838 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" event={"ID":"0ac45631-2e26-4153-b619-9608c84a4580","Type":"ContainerStarted","Data":"643e0eaf6393337a55c76fe936ba5a61e07b46ec07820f912bd71632580e8eec"} Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.298269 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.312353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z6kh5" event={"ID":"f7975923-a8ea-4269-b285-6fa5a4d639ad","Type":"ContainerStarted","Data":"da168f79a7b207b9ee1d34d14329d77acf028437bdbecf46287f05dd97913ccc"} Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.312725 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-z6kh5" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.328052 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" podStartSLOduration=3.328030227 podStartE2EDuration="3.328030227s" podCreationTimestamp="2026-03-10 15:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:25:50.325484784 +0000 UTC m=+1215.032299532" watchObservedRunningTime="2026-03-10 15:25:50.328030227 +0000 UTC m=+1215.034844975" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.352158 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-z6kh5" podStartSLOduration=3.35213792 podStartE2EDuration="3.35213792s" podCreationTimestamp="2026-03-10 15:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-10 15:25:50.346163918 +0000 UTC m=+1215.052978676" watchObservedRunningTime="2026-03-10 15:25:50.35213792 +0000 UTC m=+1215.058952668" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.549313 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.767708 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.876018 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-7bklp"] Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.897969 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5js7q"] Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.899342 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.912351 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5js7q"] Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.980418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.980458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x59c\" (UniqueName: \"kubernetes.io/projected/016d978b-7540-425c-8328-75d43cf9f042-kube-api-access-6x59c\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 
15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.980482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.980519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-config\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:50 crc kubenswrapper[4743]: I0310 15:25:50.980544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.082648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.082697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x59c\" (UniqueName: \"kubernetes.io/projected/016d978b-7540-425c-8328-75d43cf9f042-kube-api-access-6x59c\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc 
kubenswrapper[4743]: I0310 15:25:51.082719 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.082751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-config\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.082776 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.083643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.083670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.083837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.083907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-config\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.107768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x59c\" (UniqueName: \"kubernetes.io/projected/016d978b-7540-425c-8328-75d43cf9f042-kube-api-access-6x59c\") pod \"dnsmasq-dns-b8fbc5445-5js7q\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.218793 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.329167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3431d016-101b-4513-8027-33805ae14fce","Type":"ContainerStarted","Data":"5f01ea96cafdd8eefd3311bff0b8305d1c76f6219ca73444d8e742f44db712f4"} Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.698714 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5js7q"] Mar 10 15:25:51 crc kubenswrapper[4743]: W0310 15:25:51.703794 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod016d978b_7540_425c_8328_75d43cf9f042.slice/crio-ebccc5eb5d3459e1341c23c5f9ff3ee45c2f85eb7d5275702d70ea7e26210e46 WatchSource:0}: Error finding container ebccc5eb5d3459e1341c23c5f9ff3ee45c2f85eb7d5275702d70ea7e26210e46: Status 404 returned error can't find the container with id ebccc5eb5d3459e1341c23c5f9ff3ee45c2f85eb7d5275702d70ea7e26210e46 Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.979631 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.987528 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.989756 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.990381 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.990391 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 15:25:51 crc kubenswrapper[4743]: I0310 15:25:51.991327 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6fgpl" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.014511 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.106784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05770cd2-4275-4fcc-bd98-f8951c4d91ba-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.107199 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/05770cd2-4275-4fcc-bd98-f8951c4d91ba-lock\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.107271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 
15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.107304 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx724\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-kube-api-access-tx724\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.107337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/05770cd2-4275-4fcc-bd98-f8951c4d91ba-cache\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.107605 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.209088 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/05770cd2-4275-4fcc-bd98-f8951c4d91ba-cache\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.209218 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.209276 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/05770cd2-4275-4fcc-bd98-f8951c4d91ba-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.209319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/05770cd2-4275-4fcc-bd98-f8951c4d91ba-lock\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.209377 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.209422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx724\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-kube-api-access-tx724\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: E0310 15:25:52.209668 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:52 crc kubenswrapper[4743]: E0310 15:25:52.209711 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.209736 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/05770cd2-4275-4fcc-bd98-f8951c4d91ba-cache\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 
15:25:52 crc kubenswrapper[4743]: E0310 15:25:52.209782 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift podName:05770cd2-4275-4fcc-bd98-f8951c4d91ba nodeName:}" failed. No retries permitted until 2026-03-10 15:25:52.709753592 +0000 UTC m=+1217.416568410 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift") pod "swift-storage-0" (UID: "05770cd2-4275-4fcc-bd98-f8951c4d91ba") : configmap "swift-ring-files" not found Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.209960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/05770cd2-4275-4fcc-bd98-f8951c4d91ba-lock\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.210092 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.217450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05770cd2-4275-4fcc-bd98-f8951c4d91ba-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.242670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx724\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-kube-api-access-tx724\") pod \"swift-storage-0\" (UID: 
\"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.243925 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.251769 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-lbfhc"] Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.253475 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.256801 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.257044 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.257173 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.265278 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lbfhc"] Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.311366 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-scripts\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.311856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-combined-ca-bundle\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.312014 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f2a6755-0e08-482b-9815-88840f35fb4e-etc-swift\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.312141 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm7xp\" (UniqueName: \"kubernetes.io/projected/1f2a6755-0e08-482b-9815-88840f35fb4e-kube-api-access-pm7xp\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.312294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-dispersionconf\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.312390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-ring-data-devices\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.312558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-swiftconf\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.336212 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" event={"ID":"016d978b-7540-425c-8328-75d43cf9f042","Type":"ContainerStarted","Data":"ebccc5eb5d3459e1341c23c5f9ff3ee45c2f85eb7d5275702d70ea7e26210e46"} Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.337889 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" podUID="0ac45631-2e26-4153-b619-9608c84a4580" containerName="dnsmasq-dns" containerID="cri-o://643e0eaf6393337a55c76fe936ba5a61e07b46ec07820f912bd71632580e8eec" gracePeriod=10 Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.413921 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-scripts\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.414245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-combined-ca-bundle\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.414365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f2a6755-0e08-482b-9815-88840f35fb4e-etc-swift\") pod \"swift-ring-rebalance-lbfhc\" (UID: 
\"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.414524 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm7xp\" (UniqueName: \"kubernetes.io/projected/1f2a6755-0e08-482b-9815-88840f35fb4e-kube-api-access-pm7xp\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.414972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-dispersionconf\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.415128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-ring-data-devices\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.419033 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-swiftconf\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.414916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f2a6755-0e08-482b-9815-88840f35fb4e-etc-swift\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" 
Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.416168 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-ring-data-devices\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.414806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-scripts\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.419714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-combined-ca-bundle\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.419937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-dispersionconf\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.431575 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-swiftconf\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.441625 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pm7xp\" (UniqueName: \"kubernetes.io/projected/1f2a6755-0e08-482b-9815-88840f35fb4e-kube-api-access-pm7xp\") pod \"swift-ring-rebalance-lbfhc\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") " pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.616494 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:25:52 crc kubenswrapper[4743]: I0310 15:25:52.723287 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:52 crc kubenswrapper[4743]: E0310 15:25:52.723488 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:52 crc kubenswrapper[4743]: E0310 15:25:52.723735 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:52 crc kubenswrapper[4743]: E0310 15:25:52.723802 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift podName:05770cd2-4275-4fcc-bd98-f8951c4d91ba nodeName:}" failed. No retries permitted until 2026-03-10 15:25:53.723780138 +0000 UTC m=+1218.430594886 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift") pod "swift-storage-0" (UID: "05770cd2-4275-4fcc-bd98-f8951c4d91ba") : configmap "swift-ring-files" not found Mar 10 15:25:53 crc kubenswrapper[4743]: I0310 15:25:53.345612 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ac45631-2e26-4153-b619-9608c84a4580" containerID="643e0eaf6393337a55c76fe936ba5a61e07b46ec07820f912bd71632580e8eec" exitCode=0 Mar 10 15:25:53 crc kubenswrapper[4743]: I0310 15:25:53.345656 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" event={"ID":"0ac45631-2e26-4153-b619-9608c84a4580","Type":"ContainerDied","Data":"643e0eaf6393337a55c76fe936ba5a61e07b46ec07820f912bd71632580e8eec"} Mar 10 15:25:53 crc kubenswrapper[4743]: I0310 15:25:53.740451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:53 crc kubenswrapper[4743]: E0310 15:25:53.740660 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:53 crc kubenswrapper[4743]: E0310 15:25:53.740845 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:53 crc kubenswrapper[4743]: E0310 15:25:53.740897 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift podName:05770cd2-4275-4fcc-bd98-f8951c4d91ba nodeName:}" failed. No retries permitted until 2026-03-10 15:25:55.740882634 +0000 UTC m=+1220.447697382 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift") pod "swift-storage-0" (UID: "05770cd2-4275-4fcc-bd98-f8951c4d91ba") : configmap "swift-ring-files" not found Mar 10 15:25:54 crc kubenswrapper[4743]: I0310 15:25:54.343785 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lbfhc"] Mar 10 15:25:54 crc kubenswrapper[4743]: I0310 15:25:54.860960 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" Mar 10 15:25:54 crc kubenswrapper[4743]: I0310 15:25:54.965401 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dl76\" (UniqueName: \"kubernetes.io/projected/0ac45631-2e26-4153-b619-9608c84a4580-kube-api-access-2dl76\") pod \"0ac45631-2e26-4153-b619-9608c84a4580\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " Mar 10 15:25:54 crc kubenswrapper[4743]: I0310 15:25:54.965457 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-config\") pod \"0ac45631-2e26-4153-b619-9608c84a4580\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " Mar 10 15:25:54 crc kubenswrapper[4743]: I0310 15:25:54.965603 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-ovsdbserver-sb\") pod \"0ac45631-2e26-4153-b619-9608c84a4580\" (UID: \"0ac45631-2e26-4153-b619-9608c84a4580\") " Mar 10 15:25:54 crc kubenswrapper[4743]: I0310 15:25:54.965665 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-dns-svc\") pod \"0ac45631-2e26-4153-b619-9608c84a4580\" (UID: 
\"0ac45631-2e26-4153-b619-9608c84a4580\") " Mar 10 15:25:54 crc kubenswrapper[4743]: I0310 15:25:54.975106 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac45631-2e26-4153-b619-9608c84a4580-kube-api-access-2dl76" (OuterVolumeSpecName: "kube-api-access-2dl76") pod "0ac45631-2e26-4153-b619-9608c84a4580" (UID: "0ac45631-2e26-4153-b619-9608c84a4580"). InnerVolumeSpecName "kube-api-access-2dl76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.017268 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ac45631-2e26-4153-b619-9608c84a4580" (UID: "0ac45631-2e26-4153-b619-9608c84a4580"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.018665 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ac45631-2e26-4153-b619-9608c84a4580" (UID: "0ac45631-2e26-4153-b619-9608c84a4580"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.026831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-config" (OuterVolumeSpecName: "config") pod "0ac45631-2e26-4153-b619-9608c84a4580" (UID: "0ac45631-2e26-4153-b619-9608c84a4580"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.067769 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dl76\" (UniqueName: \"kubernetes.io/projected/0ac45631-2e26-4153-b619-9608c84a4580-kube-api-access-2dl76\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.067826 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.067841 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.067853 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac45631-2e26-4153-b619-9608c84a4580-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.363103 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" event={"ID":"0ac45631-2e26-4153-b619-9608c84a4580","Type":"ContainerDied","Data":"46963c51b50ebb237e5661592e51c3079fd276d3638bf36647119e92f4841fb0"} Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.363399 4743 scope.go:117] "RemoveContainer" containerID="643e0eaf6393337a55c76fe936ba5a61e07b46ec07820f912bd71632580e8eec" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.363154 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-7bklp" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.364946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lbfhc" event={"ID":"1f2a6755-0e08-482b-9815-88840f35fb4e","Type":"ContainerStarted","Data":"e18380f6b59496e7450bba73db26cf34ecdb304026cbda48df1f6f60277232da"} Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.368509 4743 generic.go:334] "Generic (PLEG): container finished" podID="016d978b-7540-425c-8328-75d43cf9f042" containerID="54a0e0020ddd816943bdd70bab2e6d0987c76359b2a5a1fca94a254609a4b372" exitCode=0 Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.368557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" event={"ID":"016d978b-7540-425c-8328-75d43cf9f042","Type":"ContainerDied","Data":"54a0e0020ddd816943bdd70bab2e6d0987c76359b2a5a1fca94a254609a4b372"} Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.394376 4743 scope.go:117] "RemoveContainer" containerID="399f54163fabd1ff5332b7389886435af00c10caf05730e95e181fd89f931bf0" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.434364 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-7bklp"] Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.434425 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-7bklp"] Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.782331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:55 crc kubenswrapper[4743]: E0310 15:25:55.782547 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 
15:25:55 crc kubenswrapper[4743]: E0310 15:25:55.782564 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:55 crc kubenswrapper[4743]: E0310 15:25:55.782609 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift podName:05770cd2-4275-4fcc-bd98-f8951c4d91ba nodeName:}" failed. No retries permitted until 2026-03-10 15:25:59.782596571 +0000 UTC m=+1224.489411319 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift") pod "swift-storage-0" (UID: "05770cd2-4275-4fcc-bd98-f8951c4d91ba") : configmap "swift-ring-files" not found Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.828094 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.929310 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac45631-2e26-4153-b619-9608c84a4580" path="/var/lib/kubelet/pods/0ac45631-2e26-4153-b619-9608c84a4580/volumes" Mar 10 15:25:55 crc kubenswrapper[4743]: I0310 15:25:55.930036 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.384121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3431d016-101b-4513-8027-33805ae14fce","Type":"ContainerStarted","Data":"e9d4b3e538665dadddb25a79ac9379033593364017d7e32081177c4770a33a0e"} Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.422962 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ksg46"] Mar 10 15:25:56 crc kubenswrapper[4743]: E0310 15:25:56.423522 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0ac45631-2e26-4153-b619-9608c84a4580" containerName="init" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.423545 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac45631-2e26-4153-b619-9608c84a4580" containerName="init" Mar 10 15:25:56 crc kubenswrapper[4743]: E0310 15:25:56.423582 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac45631-2e26-4153-b619-9608c84a4580" containerName="dnsmasq-dns" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.423591 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac45631-2e26-4153-b619-9608c84a4580" containerName="dnsmasq-dns" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.423856 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac45631-2e26-4153-b619-9608c84a4580" containerName="dnsmasq-dns" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.424572 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ksg46" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.432472 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.448278 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ksg46"] Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.455796 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.455857 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.498774 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2094d4dd-213a-4db3-9768-321112175d9a-operator-scripts\") pod \"root-account-create-update-ksg46\" (UID: \"2094d4dd-213a-4db3-9768-321112175d9a\") " pod="openstack/root-account-create-update-ksg46" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.499263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wfw7\" (UniqueName: \"kubernetes.io/projected/2094d4dd-213a-4db3-9768-321112175d9a-kube-api-access-8wfw7\") pod \"root-account-create-update-ksg46\" (UID: \"2094d4dd-213a-4db3-9768-321112175d9a\") " pod="openstack/root-account-create-update-ksg46" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.545267 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.602092 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2094d4dd-213a-4db3-9768-321112175d9a-operator-scripts\") pod \"root-account-create-update-ksg46\" (UID: \"2094d4dd-213a-4db3-9768-321112175d9a\") " pod="openstack/root-account-create-update-ksg46" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.602161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wfw7\" (UniqueName: \"kubernetes.io/projected/2094d4dd-213a-4db3-9768-321112175d9a-kube-api-access-8wfw7\") pod \"root-account-create-update-ksg46\" (UID: \"2094d4dd-213a-4db3-9768-321112175d9a\") " pod="openstack/root-account-create-update-ksg46" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.603041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2094d4dd-213a-4db3-9768-321112175d9a-operator-scripts\") pod \"root-account-create-update-ksg46\" (UID: \"2094d4dd-213a-4db3-9768-321112175d9a\") " 
pod="openstack/root-account-create-update-ksg46" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.621855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wfw7\" (UniqueName: \"kubernetes.io/projected/2094d4dd-213a-4db3-9768-321112175d9a-kube-api-access-8wfw7\") pod \"root-account-create-update-ksg46\" (UID: \"2094d4dd-213a-4db3-9768-321112175d9a\") " pod="openstack/root-account-create-update-ksg46" Mar 10 15:25:56 crc kubenswrapper[4743]: I0310 15:25:56.753525 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ksg46" Mar 10 15:25:57 crc kubenswrapper[4743]: I0310 15:25:57.479383 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 15:25:57 crc kubenswrapper[4743]: I0310 15:25:57.736304 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-z6kh5" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.237405 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8kq9l"] Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.238554 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8kq9l" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.248726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gcwn\" (UniqueName: \"kubernetes.io/projected/7bc89d4c-f23b-4795-bfcd-96318a266339-kube-api-access-6gcwn\") pod \"glance-db-create-8kq9l\" (UID: \"7bc89d4c-f23b-4795-bfcd-96318a266339\") " pod="openstack/glance-db-create-8kq9l" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.248862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc89d4c-f23b-4795-bfcd-96318a266339-operator-scripts\") pod \"glance-db-create-8kq9l\" (UID: \"7bc89d4c-f23b-4795-bfcd-96318a266339\") " pod="openstack/glance-db-create-8kq9l" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.269042 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8kq9l"] Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.308022 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ksg46"] Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.350292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gcwn\" (UniqueName: \"kubernetes.io/projected/7bc89d4c-f23b-4795-bfcd-96318a266339-kube-api-access-6gcwn\") pod \"glance-db-create-8kq9l\" (UID: \"7bc89d4c-f23b-4795-bfcd-96318a266339\") " pod="openstack/glance-db-create-8kq9l" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.350359 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc89d4c-f23b-4795-bfcd-96318a266339-operator-scripts\") pod \"glance-db-create-8kq9l\" (UID: \"7bc89d4c-f23b-4795-bfcd-96318a266339\") " pod="openstack/glance-db-create-8kq9l" Mar 10 15:25:58 crc 
kubenswrapper[4743]: I0310 15:25:58.351204 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc89d4c-f23b-4795-bfcd-96318a266339-operator-scripts\") pod \"glance-db-create-8kq9l\" (UID: \"7bc89d4c-f23b-4795-bfcd-96318a266339\") " pod="openstack/glance-db-create-8kq9l" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.358032 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a335-account-create-update-bh55z"] Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.359134 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.361408 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.370953 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a335-account-create-update-bh55z"] Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.381797 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gcwn\" (UniqueName: \"kubernetes.io/projected/7bc89d4c-f23b-4795-bfcd-96318a266339-kube-api-access-6gcwn\") pod \"glance-db-create-8kq9l\" (UID: \"7bc89d4c-f23b-4795-bfcd-96318a266339\") " pod="openstack/glance-db-create-8kq9l" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.408808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3431d016-101b-4513-8027-33805ae14fce","Type":"ContainerStarted","Data":"67fa4ceba5f414a770450b22fe664c0bc61bd0baac2d98ae50b89005b690209d"} Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.410595 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.413845 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-ring-rebalance-lbfhc" event={"ID":"1f2a6755-0e08-482b-9815-88840f35fb4e","Type":"ContainerStarted","Data":"9c7363e95d1deab266462107dd5f529c84ce7d11509454f9342c7b22d584e622"} Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.417485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" event={"ID":"016d978b-7540-425c-8328-75d43cf9f042","Type":"ContainerStarted","Data":"4190571d4cd4869dcbfd420bfa52f9a27dc08138e66d9e3a4ad68baa16ae9bfe"} Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.417533 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.420097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ksg46" event={"ID":"2094d4dd-213a-4db3-9768-321112175d9a","Type":"ContainerStarted","Data":"d3dd5c2b1ebeda048b51347648ad23f4d15d77e89fae17d97f40b15bf57454ae"} Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.438352 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.77933462 podStartE2EDuration="9.438334901s" podCreationTimestamp="2026-03-10 15:25:49 +0000 UTC" firstStartedPulling="2026-03-10 15:25:50.558710362 +0000 UTC m=+1215.265525110" lastFinishedPulling="2026-03-10 15:25:55.217710643 +0000 UTC m=+1219.924525391" observedRunningTime="2026-03-10 15:25:58.431761292 +0000 UTC m=+1223.138576040" watchObservedRunningTime="2026-03-10 15:25:58.438334901 +0000 UTC m=+1223.145149639" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.460636 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" podStartSLOduration=8.460609071 podStartE2EDuration="8.460609071s" podCreationTimestamp="2026-03-10 15:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:25:58.450447759 +0000 UTC m=+1223.157262507" watchObservedRunningTime="2026-03-10 15:25:58.460609071 +0000 UTC m=+1223.167423839" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.472137 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-lbfhc" podStartSLOduration=3.074606127 podStartE2EDuration="6.472116203s" podCreationTimestamp="2026-03-10 15:25:52 +0000 UTC" firstStartedPulling="2026-03-10 15:25:54.357631034 +0000 UTC m=+1219.064445782" lastFinishedPulling="2026-03-10 15:25:57.75514111 +0000 UTC m=+1222.461955858" observedRunningTime="2026-03-10 15:25:58.468575941 +0000 UTC m=+1223.175390699" watchObservedRunningTime="2026-03-10 15:25:58.472116203 +0000 UTC m=+1223.178930951" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.554020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-operator-scripts\") pod \"glance-a335-account-create-update-bh55z\" (UID: \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\") " pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.554291 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9cvb\" (UniqueName: \"kubernetes.io/projected/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-kube-api-access-t9cvb\") pod \"glance-a335-account-create-update-bh55z\" (UID: \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\") " pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.575794 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8kq9l" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.656040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-operator-scripts\") pod \"glance-a335-account-create-update-bh55z\" (UID: \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\") " pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.656118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9cvb\" (UniqueName: \"kubernetes.io/projected/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-kube-api-access-t9cvb\") pod \"glance-a335-account-create-update-bh55z\" (UID: \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\") " pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.657070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-operator-scripts\") pod \"glance-a335-account-create-update-bh55z\" (UID: \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\") " pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.679572 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9cvb\" (UniqueName: \"kubernetes.io/projected/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-kube-api-access-t9cvb\") pod \"glance-a335-account-create-update-bh55z\" (UID: \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\") " pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:25:58 crc kubenswrapper[4743]: I0310 15:25:58.976566 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.047693 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8kq9l"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.129291 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vvwgs"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.130925 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vvwgs" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.162136 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vvwgs"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.198426 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8746-account-create-update-v57t2"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.199913 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.202087 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.205765 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8746-account-create-update-v57t2"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.268849 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp4pg\" (UniqueName: \"kubernetes.io/projected/b8c53268-9723-45cc-b49b-068b245ea223-kube-api-access-fp4pg\") pod \"keystone-db-create-vvwgs\" (UID: \"b8c53268-9723-45cc-b49b-068b245ea223\") " pod="openstack/keystone-db-create-vvwgs" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.269177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c53268-9723-45cc-b49b-068b245ea223-operator-scripts\") pod \"keystone-db-create-vvwgs\" (UID: \"b8c53268-9723-45cc-b49b-068b245ea223\") " pod="openstack/keystone-db-create-vvwgs" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.371584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5g7p\" (UniqueName: \"kubernetes.io/projected/a74cb34e-ac59-473e-aa30-2c148a81e0ea-kube-api-access-f5g7p\") pod \"keystone-8746-account-create-update-v57t2\" (UID: \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\") " pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.371688 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74cb34e-ac59-473e-aa30-2c148a81e0ea-operator-scripts\") pod 
\"keystone-8746-account-create-update-v57t2\" (UID: \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\") " pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.371783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp4pg\" (UniqueName: \"kubernetes.io/projected/b8c53268-9723-45cc-b49b-068b245ea223-kube-api-access-fp4pg\") pod \"keystone-db-create-vvwgs\" (UID: \"b8c53268-9723-45cc-b49b-068b245ea223\") " pod="openstack/keystone-db-create-vvwgs" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.371857 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c53268-9723-45cc-b49b-068b245ea223-operator-scripts\") pod \"keystone-db-create-vvwgs\" (UID: \"b8c53268-9723-45cc-b49b-068b245ea223\") " pod="openstack/keystone-db-create-vvwgs" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.372556 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c53268-9723-45cc-b49b-068b245ea223-operator-scripts\") pod \"keystone-db-create-vvwgs\" (UID: \"b8c53268-9723-45cc-b49b-068b245ea223\") " pod="openstack/keystone-db-create-vvwgs" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.373633 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-bz2n7"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.375036 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-bz2n7" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.392840 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bz2n7"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.399489 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp4pg\" (UniqueName: \"kubernetes.io/projected/b8c53268-9723-45cc-b49b-068b245ea223-kube-api-access-fp4pg\") pod \"keystone-db-create-vvwgs\" (UID: \"b8c53268-9723-45cc-b49b-068b245ea223\") " pod="openstack/keystone-db-create-vvwgs" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.429491 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8kq9l" event={"ID":"7bc89d4c-f23b-4795-bfcd-96318a266339","Type":"ContainerStarted","Data":"ef22e1193204539d41cdc97b630e58390c221111525c11cb1d646bfcc7d278b1"} Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.429542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8kq9l" event={"ID":"7bc89d4c-f23b-4795-bfcd-96318a266339","Type":"ContainerStarted","Data":"872109b12e75bf41b9f18cebd3d61722869aa47e496631444f348e81f398826a"} Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.435306 4743 generic.go:334] "Generic (PLEG): container finished" podID="2094d4dd-213a-4db3-9768-321112175d9a" containerID="85961ca2d9d955f64b34fa59578fe2c676024233d93b9fc00272972bd27d3b40" exitCode=0 Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.435484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ksg46" event={"ID":"2094d4dd-213a-4db3-9768-321112175d9a","Type":"ContainerDied","Data":"85961ca2d9d955f64b34fa59578fe2c676024233d93b9fc00272972bd27d3b40"} Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.469129 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-8kq9l" 
podStartSLOduration=1.46911105 podStartE2EDuration="1.46911105s" podCreationTimestamp="2026-03-10 15:25:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:25:59.453846701 +0000 UTC m=+1224.160661469" watchObservedRunningTime="2026-03-10 15:25:59.46911105 +0000 UTC m=+1224.175925798" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.473066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74cb34e-ac59-473e-aa30-2c148a81e0ea-operator-scripts\") pod \"keystone-8746-account-create-update-v57t2\" (UID: \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\") " pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.473234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm856\" (UniqueName: \"kubernetes.io/projected/c5f8c087-774e-49b4-b660-d3c4d9b72061-kube-api-access-cm856\") pod \"placement-db-create-bz2n7\" (UID: \"c5f8c087-774e-49b4-b660-d3c4d9b72061\") " pod="openstack/placement-db-create-bz2n7" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.473289 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5g7p\" (UniqueName: \"kubernetes.io/projected/a74cb34e-ac59-473e-aa30-2c148a81e0ea-kube-api-access-f5g7p\") pod \"keystone-8746-account-create-update-v57t2\" (UID: \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\") " pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.473442 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f8c087-774e-49b4-b660-d3c4d9b72061-operator-scripts\") pod \"placement-db-create-bz2n7\" (UID: 
\"c5f8c087-774e-49b4-b660-d3c4d9b72061\") " pod="openstack/placement-db-create-bz2n7" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.473996 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74cb34e-ac59-473e-aa30-2c148a81e0ea-operator-scripts\") pod \"keystone-8746-account-create-update-v57t2\" (UID: \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\") " pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.476115 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vvwgs" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.500445 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5g7p\" (UniqueName: \"kubernetes.io/projected/a74cb34e-ac59-473e-aa30-2c148a81e0ea-kube-api-access-f5g7p\") pod \"keystone-8746-account-create-update-v57t2\" (UID: \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\") " pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.502491 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ada4-account-create-update-vgjvq"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.511359 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.516771 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.529567 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.533119 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ada4-account-create-update-vgjvq"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.545647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a335-account-create-update-bh55z"] Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.575511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm856\" (UniqueName: \"kubernetes.io/projected/c5f8c087-774e-49b4-b660-d3c4d9b72061-kube-api-access-cm856\") pod \"placement-db-create-bz2n7\" (UID: \"c5f8c087-774e-49b4-b660-d3c4d9b72061\") " pod="openstack/placement-db-create-bz2n7" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.575761 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f8c087-774e-49b4-b660-d3c4d9b72061-operator-scripts\") pod \"placement-db-create-bz2n7\" (UID: \"c5f8c087-774e-49b4-b660-d3c4d9b72061\") " pod="openstack/placement-db-create-bz2n7" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.577615 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f8c087-774e-49b4-b660-d3c4d9b72061-operator-scripts\") pod \"placement-db-create-bz2n7\" (UID: \"c5f8c087-774e-49b4-b660-d3c4d9b72061\") " pod="openstack/placement-db-create-bz2n7" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.599066 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm856\" (UniqueName: \"kubernetes.io/projected/c5f8c087-774e-49b4-b660-d3c4d9b72061-kube-api-access-cm856\") pod \"placement-db-create-bz2n7\" (UID: \"c5f8c087-774e-49b4-b660-d3c4d9b72061\") " pod="openstack/placement-db-create-bz2n7" Mar 10 
15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.678751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cbf843-671d-45e8-8844-f1c0c3d9e099-operator-scripts\") pod \"placement-ada4-account-create-update-vgjvq\" (UID: \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\") " pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.678839 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6xz\" (UniqueName: \"kubernetes.io/projected/e6cbf843-671d-45e8-8844-f1c0c3d9e099-kube-api-access-zv6xz\") pod \"placement-ada4-account-create-update-vgjvq\" (UID: \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\") " pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.698035 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bz2n7" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.780936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cbf843-671d-45e8-8844-f1c0c3d9e099-operator-scripts\") pod \"placement-ada4-account-create-update-vgjvq\" (UID: \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\") " pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.781012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6xz\" (UniqueName: \"kubernetes.io/projected/e6cbf843-671d-45e8-8844-f1c0c3d9e099-kube-api-access-zv6xz\") pod \"placement-ada4-account-create-update-vgjvq\" (UID: \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\") " pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.781662 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cbf843-671d-45e8-8844-f1c0c3d9e099-operator-scripts\") pod \"placement-ada4-account-create-update-vgjvq\" (UID: \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\") " pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.800891 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6xz\" (UniqueName: \"kubernetes.io/projected/e6cbf843-671d-45e8-8844-f1c0c3d9e099-kube-api-access-zv6xz\") pod \"placement-ada4-account-create-update-vgjvq\" (UID: \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\") " pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.845350 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.882772 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0" Mar 10 15:25:59 crc kubenswrapper[4743]: E0310 15:25:59.882969 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:59 crc kubenswrapper[4743]: E0310 15:25:59.882983 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:59 crc kubenswrapper[4743]: E0310 15:25:59.883024 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift podName:05770cd2-4275-4fcc-bd98-f8951c4d91ba nodeName:}" failed. 
No retries permitted until 2026-03-10 15:26:07.883011036 +0000 UTC m=+1232.589825784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift") pod "swift-storage-0" (UID: "05770cd2-4275-4fcc-bd98-f8951c4d91ba") : configmap "swift-ring-files" not found Mar 10 15:25:59 crc kubenswrapper[4743]: I0310 15:25:59.990848 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vvwgs"] Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.141562 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552606-qc86p"] Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.142699 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552606-qc86p" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.145423 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.145762 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.145903 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.151933 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552606-qc86p"] Mar 10 15:26:00 crc kubenswrapper[4743]: W0310 15:26:00.182784 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda74cb34e_ac59_473e_aa30_2c148a81e0ea.slice/crio-bef637fcaef84d268257dd4bd5d1e5e2cae7848e00281a13b86eb2ef07f57588 WatchSource:0}: Error finding container 
bef637fcaef84d268257dd4bd5d1e5e2cae7848e00281a13b86eb2ef07f57588: Status 404 returned error can't find the container with id bef637fcaef84d268257dd4bd5d1e5e2cae7848e00281a13b86eb2ef07f57588 Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.186554 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8746-account-create-update-v57t2"] Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.245566 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bz2n7"] Mar 10 15:26:00 crc kubenswrapper[4743]: W0310 15:26:00.249467 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5f8c087_774e_49b4_b660_d3c4d9b72061.slice/crio-c55d1948153a5b19ea69e45be544c78ba45a8147186e4ed323dd334466e6cec8 WatchSource:0}: Error finding container c55d1948153a5b19ea69e45be544c78ba45a8147186e4ed323dd334466e6cec8: Status 404 returned error can't find the container with id c55d1948153a5b19ea69e45be544c78ba45a8147186e4ed323dd334466e6cec8 Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.290899 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5wqb\" (UniqueName: \"kubernetes.io/projected/4537e57c-2a6d-44f9-9724-2ad061bed454-kube-api-access-q5wqb\") pod \"auto-csr-approver-29552606-qc86p\" (UID: \"4537e57c-2a6d-44f9-9724-2ad061bed454\") " pod="openshift-infra/auto-csr-approver-29552606-qc86p" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.393549 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5wqb\" (UniqueName: \"kubernetes.io/projected/4537e57c-2a6d-44f9-9724-2ad061bed454-kube-api-access-q5wqb\") pod \"auto-csr-approver-29552606-qc86p\" (UID: \"4537e57c-2a6d-44f9-9724-2ad061bed454\") " pod="openshift-infra/auto-csr-approver-29552606-qc86p" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.413081 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5wqb\" (UniqueName: \"kubernetes.io/projected/4537e57c-2a6d-44f9-9724-2ad061bed454-kube-api-access-q5wqb\") pod \"auto-csr-approver-29552606-qc86p\" (UID: \"4537e57c-2a6d-44f9-9724-2ad061bed454\") " pod="openshift-infra/auto-csr-approver-29552606-qc86p" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.430182 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ada4-account-create-update-vgjvq"] Mar 10 15:26:00 crc kubenswrapper[4743]: W0310 15:26:00.437117 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6cbf843_671d_45e8_8844_f1c0c3d9e099.slice/crio-9fec6204fe9a2c9364a685b0a9cb241bc2c4806132676d16e73a5c13a07030d5 WatchSource:0}: Error finding container 9fec6204fe9a2c9364a685b0a9cb241bc2c4806132676d16e73a5c13a07030d5: Status 404 returned error can't find the container with id 9fec6204fe9a2c9364a685b0a9cb241bc2c4806132676d16e73a5c13a07030d5 Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.451081 4743 generic.go:334] "Generic (PLEG): container finished" podID="3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806" containerID="a2d3367b0007bc5df8892fc90c19e0dc1af34864bb7630c2d95926468bb1659d" exitCode=0 Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.451169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a335-account-create-update-bh55z" event={"ID":"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806","Type":"ContainerDied","Data":"a2d3367b0007bc5df8892fc90c19e0dc1af34864bb7630c2d95926468bb1659d"} Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.451199 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a335-account-create-update-bh55z" event={"ID":"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806","Type":"ContainerStarted","Data":"6a458f4f8b2039af2aaed9b3b5d171353e2479fc5f4cd3c17549f98d29e28ac1"} Mar 10 15:26:00 crc 
kubenswrapper[4743]: I0310 15:26:00.455611 4743 generic.go:334] "Generic (PLEG): container finished" podID="7bc89d4c-f23b-4795-bfcd-96318a266339" containerID="ef22e1193204539d41cdc97b630e58390c221111525c11cb1d646bfcc7d278b1" exitCode=0 Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.455681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8kq9l" event={"ID":"7bc89d4c-f23b-4795-bfcd-96318a266339","Type":"ContainerDied","Data":"ef22e1193204539d41cdc97b630e58390c221111525c11cb1d646bfcc7d278b1"} Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.460901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bz2n7" event={"ID":"c5f8c087-774e-49b4-b660-d3c4d9b72061","Type":"ContainerStarted","Data":"9dbe58b8ead8597d746676b37427f444f7b3bbfc54ff179db882a8a5efd31812"} Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.460954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bz2n7" event={"ID":"c5f8c087-774e-49b4-b660-d3c4d9b72061","Type":"ContainerStarted","Data":"c55d1948153a5b19ea69e45be544c78ba45a8147186e4ed323dd334466e6cec8"} Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.477468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8746-account-create-update-v57t2" event={"ID":"a74cb34e-ac59-473e-aa30-2c148a81e0ea","Type":"ContainerStarted","Data":"305b046085f83c938e43b5e2730abb0fb27bea091b6845ae349b0f5afe67872a"} Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.477531 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8746-account-create-update-v57t2" event={"ID":"a74cb34e-ac59-473e-aa30-2c148a81e0ea","Type":"ContainerStarted","Data":"bef637fcaef84d268257dd4bd5d1e5e2cae7848e00281a13b86eb2ef07f57588"} Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.479906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vvwgs" 
event={"ID":"b8c53268-9723-45cc-b49b-068b245ea223","Type":"ContainerStarted","Data":"5693e71c8d0e859edaee1a6980c2dacd9bfea1a3fa8cd297dbf6d60811575a1c"} Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.479952 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vvwgs" event={"ID":"b8c53268-9723-45cc-b49b-068b245ea223","Type":"ContainerStarted","Data":"2e8d6d69942aad3dc491cbe1d0cbe59c7ee76f4325d8e26c757d7cf5d4be925f"} Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.499651 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552606-qc86p" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.518089 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-bz2n7" podStartSLOduration=1.518068713 podStartE2EDuration="1.518068713s" podCreationTimestamp="2026-03-10 15:25:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:00.515037956 +0000 UTC m=+1225.221852714" watchObservedRunningTime="2026-03-10 15:26:00.518068713 +0000 UTC m=+1225.224883461" Mar 10 15:26:00 crc kubenswrapper[4743]: I0310 15:26:00.536851 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8746-account-create-update-v57t2" podStartSLOduration=1.536803242 podStartE2EDuration="1.536803242s" podCreationTimestamp="2026-03-10 15:25:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:00.533521107 +0000 UTC m=+1225.240335855" watchObservedRunningTime="2026-03-10 15:26:00.536803242 +0000 UTC m=+1225.243617990" Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.029426 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ksg46" Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.104911 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552606-qc86p"] Mar 10 15:26:01 crc kubenswrapper[4743]: W0310 15:26:01.105094 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4537e57c_2a6d_44f9_9724_2ad061bed454.slice/crio-30faccb7bb6b4f7892d40944e6cdba2592e97d380aec8ea1bd8a410e3f730789 WatchSource:0}: Error finding container 30faccb7bb6b4f7892d40944e6cdba2592e97d380aec8ea1bd8a410e3f730789: Status 404 returned error can't find the container with id 30faccb7bb6b4f7892d40944e6cdba2592e97d380aec8ea1bd8a410e3f730789 Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.112462 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2094d4dd-213a-4db3-9768-321112175d9a-operator-scripts\") pod \"2094d4dd-213a-4db3-9768-321112175d9a\" (UID: \"2094d4dd-213a-4db3-9768-321112175d9a\") " Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.112593 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wfw7\" (UniqueName: \"kubernetes.io/projected/2094d4dd-213a-4db3-9768-321112175d9a-kube-api-access-8wfw7\") pod \"2094d4dd-213a-4db3-9768-321112175d9a\" (UID: \"2094d4dd-213a-4db3-9768-321112175d9a\") " Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.113281 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2094d4dd-213a-4db3-9768-321112175d9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2094d4dd-213a-4db3-9768-321112175d9a" (UID: "2094d4dd-213a-4db3-9768-321112175d9a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.120916 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2094d4dd-213a-4db3-9768-321112175d9a-kube-api-access-8wfw7" (OuterVolumeSpecName: "kube-api-access-8wfw7") pod "2094d4dd-213a-4db3-9768-321112175d9a" (UID: "2094d4dd-213a-4db3-9768-321112175d9a"). InnerVolumeSpecName "kube-api-access-8wfw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.215194 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2094d4dd-213a-4db3-9768-321112175d9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.215542 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wfw7\" (UniqueName: \"kubernetes.io/projected/2094d4dd-213a-4db3-9768-321112175d9a-kube-api-access-8wfw7\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.489285 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ksg46" Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.489313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ksg46" event={"ID":"2094d4dd-213a-4db3-9768-321112175d9a","Type":"ContainerDied","Data":"d3dd5c2b1ebeda048b51347648ad23f4d15d77e89fae17d97f40b15bf57454ae"} Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.489414 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3dd5c2b1ebeda048b51347648ad23f4d15d77e89fae17d97f40b15bf57454ae" Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.498357 4743 generic.go:334] "Generic (PLEG): container finished" podID="c5f8c087-774e-49b4-b660-d3c4d9b72061" containerID="9dbe58b8ead8597d746676b37427f444f7b3bbfc54ff179db882a8a5efd31812" exitCode=0 Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.498892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bz2n7" event={"ID":"c5f8c087-774e-49b4-b660-d3c4d9b72061","Type":"ContainerDied","Data":"9dbe58b8ead8597d746676b37427f444f7b3bbfc54ff179db882a8a5efd31812"} Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.501724 4743 generic.go:334] "Generic (PLEG): container finished" podID="a74cb34e-ac59-473e-aa30-2c148a81e0ea" containerID="305b046085f83c938e43b5e2730abb0fb27bea091b6845ae349b0f5afe67872a" exitCode=0 Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.501965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8746-account-create-update-v57t2" event={"ID":"a74cb34e-ac59-473e-aa30-2c148a81e0ea","Type":"ContainerDied","Data":"305b046085f83c938e43b5e2730abb0fb27bea091b6845ae349b0f5afe67872a"} Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.504178 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8c53268-9723-45cc-b49b-068b245ea223" containerID="5693e71c8d0e859edaee1a6980c2dacd9bfea1a3fa8cd297dbf6d60811575a1c" 
exitCode=0 Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.504262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vvwgs" event={"ID":"b8c53268-9723-45cc-b49b-068b245ea223","Type":"ContainerDied","Data":"5693e71c8d0e859edaee1a6980c2dacd9bfea1a3fa8cd297dbf6d60811575a1c"} Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.506910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552606-qc86p" event={"ID":"4537e57c-2a6d-44f9-9724-2ad061bed454","Type":"ContainerStarted","Data":"30faccb7bb6b4f7892d40944e6cdba2592e97d380aec8ea1bd8a410e3f730789"} Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.508714 4743 generic.go:334] "Generic (PLEG): container finished" podID="e6cbf843-671d-45e8-8844-f1c0c3d9e099" containerID="1ff7c6e5e6b8c6b0512b5575c96c5dd4d9388e3bb09a106006ce83f5f293c4bb" exitCode=0 Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.508916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ada4-account-create-update-vgjvq" event={"ID":"e6cbf843-671d-45e8-8844-f1c0c3d9e099","Type":"ContainerDied","Data":"1ff7c6e5e6b8c6b0512b5575c96c5dd4d9388e3bb09a106006ce83f5f293c4bb"} Mar 10 15:26:01 crc kubenswrapper[4743]: I0310 15:26:01.509015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ada4-account-create-update-vgjvq" event={"ID":"e6cbf843-671d-45e8-8844-f1c0c3d9e099","Type":"ContainerStarted","Data":"9fec6204fe9a2c9364a685b0a9cb241bc2c4806132676d16e73a5c13a07030d5"} Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.077435 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8kq9l" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.165791 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vvwgs" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.171473 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.271700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc89d4c-f23b-4795-bfcd-96318a266339-operator-scripts\") pod \"7bc89d4c-f23b-4795-bfcd-96318a266339\" (UID: \"7bc89d4c-f23b-4795-bfcd-96318a266339\") " Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.271757 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9cvb\" (UniqueName: \"kubernetes.io/projected/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-kube-api-access-t9cvb\") pod \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\" (UID: \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\") " Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.271853 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gcwn\" (UniqueName: \"kubernetes.io/projected/7bc89d4c-f23b-4795-bfcd-96318a266339-kube-api-access-6gcwn\") pod \"7bc89d4c-f23b-4795-bfcd-96318a266339\" (UID: \"7bc89d4c-f23b-4795-bfcd-96318a266339\") " Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.271934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp4pg\" (UniqueName: \"kubernetes.io/projected/b8c53268-9723-45cc-b49b-068b245ea223-kube-api-access-fp4pg\") pod \"b8c53268-9723-45cc-b49b-068b245ea223\" (UID: \"b8c53268-9723-45cc-b49b-068b245ea223\") " Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.272023 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c53268-9723-45cc-b49b-068b245ea223-operator-scripts\") pod 
\"b8c53268-9723-45cc-b49b-068b245ea223\" (UID: \"b8c53268-9723-45cc-b49b-068b245ea223\") " Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.272089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-operator-scripts\") pod \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\" (UID: \"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806\") " Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.273149 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c53268-9723-45cc-b49b-068b245ea223-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8c53268-9723-45cc-b49b-068b245ea223" (UID: "b8c53268-9723-45cc-b49b-068b245ea223"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.273150 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806" (UID: "3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.274138 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc89d4c-f23b-4795-bfcd-96318a266339-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bc89d4c-f23b-4795-bfcd-96318a266339" (UID: "7bc89d4c-f23b-4795-bfcd-96318a266339"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.274175 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c53268-9723-45cc-b49b-068b245ea223-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.274194 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.278575 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-kube-api-access-t9cvb" (OuterVolumeSpecName: "kube-api-access-t9cvb") pod "3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806" (UID: "3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806"). InnerVolumeSpecName "kube-api-access-t9cvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.278686 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c53268-9723-45cc-b49b-068b245ea223-kube-api-access-fp4pg" (OuterVolumeSpecName: "kube-api-access-fp4pg") pod "b8c53268-9723-45cc-b49b-068b245ea223" (UID: "b8c53268-9723-45cc-b49b-068b245ea223"). InnerVolumeSpecName "kube-api-access-fp4pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.279714 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc89d4c-f23b-4795-bfcd-96318a266339-kube-api-access-6gcwn" (OuterVolumeSpecName: "kube-api-access-6gcwn") pod "7bc89d4c-f23b-4795-bfcd-96318a266339" (UID: "7bc89d4c-f23b-4795-bfcd-96318a266339"). InnerVolumeSpecName "kube-api-access-6gcwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.375697 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc89d4c-f23b-4795-bfcd-96318a266339-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.375744 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9cvb\" (UniqueName: \"kubernetes.io/projected/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806-kube-api-access-t9cvb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.375760 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gcwn\" (UniqueName: \"kubernetes.io/projected/7bc89d4c-f23b-4795-bfcd-96318a266339-kube-api-access-6gcwn\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.375776 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp4pg\" (UniqueName: \"kubernetes.io/projected/b8c53268-9723-45cc-b49b-068b245ea223-kube-api-access-fp4pg\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.516826 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a335-account-create-update-bh55z" event={"ID":"3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806","Type":"ContainerDied","Data":"6a458f4f8b2039af2aaed9b3b5d171353e2479fc5f4cd3c17549f98d29e28ac1"} Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.516879 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a458f4f8b2039af2aaed9b3b5d171353e2479fc5f4cd3c17549f98d29e28ac1" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.516945 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a335-account-create-update-bh55z" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.524625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552606-qc86p" event={"ID":"4537e57c-2a6d-44f9-9724-2ad061bed454","Type":"ContainerStarted","Data":"c3987d1d2cee35f5b6f2659fd2201aebc1fa70e42902b873ee1ac8fbf669cc47"} Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.526523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8kq9l" event={"ID":"7bc89d4c-f23b-4795-bfcd-96318a266339","Type":"ContainerDied","Data":"872109b12e75bf41b9f18cebd3d61722869aa47e496631444f348e81f398826a"} Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.526559 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="872109b12e75bf41b9f18cebd3d61722869aa47e496631444f348e81f398826a" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.526713 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8kq9l" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.528493 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vvwgs" event={"ID":"b8c53268-9723-45cc-b49b-068b245ea223","Type":"ContainerDied","Data":"2e8d6d69942aad3dc491cbe1d0cbe59c7ee76f4325d8e26c757d7cf5d4be925f"} Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.528531 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e8d6d69942aad3dc491cbe1d0cbe59c7ee76f4325d8e26c757d7cf5d4be925f" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.528746 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vvwgs" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.546610 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552606-qc86p" podStartSLOduration=1.514760732 podStartE2EDuration="2.546594441s" podCreationTimestamp="2026-03-10 15:26:00 +0000 UTC" firstStartedPulling="2026-03-10 15:26:01.108018272 +0000 UTC m=+1225.814833020" lastFinishedPulling="2026-03-10 15:26:02.139851991 +0000 UTC m=+1226.846666729" observedRunningTime="2026-03-10 15:26:02.53820855 +0000 UTC m=+1227.245023298" watchObservedRunningTime="2026-03-10 15:26:02.546594441 +0000 UTC m=+1227.253409189" Mar 10 15:26:02 crc kubenswrapper[4743]: I0310 15:26:02.895332 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8746-account-create-update-v57t2" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.067019 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bz2n7" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.073031 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ada4-account-create-update-vgjvq" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.089483 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5g7p\" (UniqueName: \"kubernetes.io/projected/a74cb34e-ac59-473e-aa30-2c148a81e0ea-kube-api-access-f5g7p\") pod \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\" (UID: \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\") " Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.089685 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74cb34e-ac59-473e-aa30-2c148a81e0ea-operator-scripts\") pod \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\" (UID: \"a74cb34e-ac59-473e-aa30-2c148a81e0ea\") " Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.090768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74cb34e-ac59-473e-aa30-2c148a81e0ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a74cb34e-ac59-473e-aa30-2c148a81e0ea" (UID: "a74cb34e-ac59-473e-aa30-2c148a81e0ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.104794 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74cb34e-ac59-473e-aa30-2c148a81e0ea-kube-api-access-f5g7p" (OuterVolumeSpecName: "kube-api-access-f5g7p") pod "a74cb34e-ac59-473e-aa30-2c148a81e0ea" (UID: "a74cb34e-ac59-473e-aa30-2c148a81e0ea"). InnerVolumeSpecName "kube-api-access-f5g7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.194868 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cbf843-671d-45e8-8844-f1c0c3d9e099-operator-scripts\") pod \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\" (UID: \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\") " Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.195037 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f8c087-774e-49b4-b660-d3c4d9b72061-operator-scripts\") pod \"c5f8c087-774e-49b4-b660-d3c4d9b72061\" (UID: \"c5f8c087-774e-49b4-b660-d3c4d9b72061\") " Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.195078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv6xz\" (UniqueName: \"kubernetes.io/projected/e6cbf843-671d-45e8-8844-f1c0c3d9e099-kube-api-access-zv6xz\") pod \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\" (UID: \"e6cbf843-671d-45e8-8844-f1c0c3d9e099\") " Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.195108 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm856\" (UniqueName: \"kubernetes.io/projected/c5f8c087-774e-49b4-b660-d3c4d9b72061-kube-api-access-cm856\") pod \"c5f8c087-774e-49b4-b660-d3c4d9b72061\" (UID: \"c5f8c087-774e-49b4-b660-d3c4d9b72061\") " Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.195617 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5g7p\" (UniqueName: \"kubernetes.io/projected/a74cb34e-ac59-473e-aa30-2c148a81e0ea-kube-api-access-f5g7p\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.195631 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a74cb34e-ac59-473e-aa30-2c148a81e0ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.196728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f8c087-774e-49b4-b660-d3c4d9b72061-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5f8c087-774e-49b4-b660-d3c4d9b72061" (UID: "c5f8c087-774e-49b4-b660-d3c4d9b72061"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.197130 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cbf843-671d-45e8-8844-f1c0c3d9e099-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6cbf843-671d-45e8-8844-f1c0c3d9e099" (UID: "e6cbf843-671d-45e8-8844-f1c0c3d9e099"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.200213 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f8c087-774e-49b4-b660-d3c4d9b72061-kube-api-access-cm856" (OuterVolumeSpecName: "kube-api-access-cm856") pod "c5f8c087-774e-49b4-b660-d3c4d9b72061" (UID: "c5f8c087-774e-49b4-b660-d3c4d9b72061"). InnerVolumeSpecName "kube-api-access-cm856". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.208407 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cbf843-671d-45e8-8844-f1c0c3d9e099-kube-api-access-zv6xz" (OuterVolumeSpecName: "kube-api-access-zv6xz") pod "e6cbf843-671d-45e8-8844-f1c0c3d9e099" (UID: "e6cbf843-671d-45e8-8844-f1c0c3d9e099"). InnerVolumeSpecName "kube-api-access-zv6xz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.297525 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f8c087-774e-49b4-b660-d3c4d9b72061-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.297577 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv6xz\" (UniqueName: \"kubernetes.io/projected/e6cbf843-671d-45e8-8844-f1c0c3d9e099-kube-api-access-zv6xz\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.297593 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm856\" (UniqueName: \"kubernetes.io/projected/c5f8c087-774e-49b4-b660-d3c4d9b72061-kube-api-access-cm856\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.297604 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cbf843-671d-45e8-8844-f1c0c3d9e099-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.563652 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zrnmr"] Mar 10 15:26:03 crc kubenswrapper[4743]: E0310 15:26:03.564512 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc89d4c-f23b-4795-bfcd-96318a266339" containerName="mariadb-database-create" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.564531 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc89d4c-f23b-4795-bfcd-96318a266339" containerName="mariadb-database-create" Mar 10 15:26:03 crc kubenswrapper[4743]: E0310 15:26:03.564575 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2094d4dd-213a-4db3-9768-321112175d9a" containerName="mariadb-account-create-update" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 
15:26:03.564584 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2094d4dd-213a-4db3-9768-321112175d9a" containerName="mariadb-account-create-update" Mar 10 15:26:03 crc kubenswrapper[4743]: E0310 15:26:03.564609 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806" containerName="mariadb-account-create-update" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.564618 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806" containerName="mariadb-account-create-update" Mar 10 15:26:03 crc kubenswrapper[4743]: E0310 15:26:03.564655 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c53268-9723-45cc-b49b-068b245ea223" containerName="mariadb-database-create" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.564663 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c53268-9723-45cc-b49b-068b245ea223" containerName="mariadb-database-create" Mar 10 15:26:03 crc kubenswrapper[4743]: E0310 15:26:03.564674 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cbf843-671d-45e8-8844-f1c0c3d9e099" containerName="mariadb-account-create-update" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.564680 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cbf843-671d-45e8-8844-f1c0c3d9e099" containerName="mariadb-account-create-update" Mar 10 15:26:03 crc kubenswrapper[4743]: E0310 15:26:03.564693 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74cb34e-ac59-473e-aa30-2c148a81e0ea" containerName="mariadb-account-create-update" Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.564699 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74cb34e-ac59-473e-aa30-2c148a81e0ea" containerName="mariadb-account-create-update" Mar 10 15:26:03 crc kubenswrapper[4743]: E0310 15:26:03.564731 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c5f8c087-774e-49b4-b660-d3c4d9b72061" containerName="mariadb-database-create"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.564739 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f8c087-774e-49b4-b660-d3c4d9b72061" containerName="mariadb-database-create"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.565196 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f8c087-774e-49b4-b660-d3c4d9b72061" containerName="mariadb-database-create"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.565218 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c53268-9723-45cc-b49b-068b245ea223" containerName="mariadb-database-create"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.565231 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2094d4dd-213a-4db3-9768-321112175d9a" containerName="mariadb-account-create-update"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.565265 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806" containerName="mariadb-account-create-update"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.565276 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74cb34e-ac59-473e-aa30-2c148a81e0ea" containerName="mariadb-account-create-update"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.565288 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cbf843-671d-45e8-8844-f1c0c3d9e099" containerName="mariadb-account-create-update"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.565369 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc89d4c-f23b-4795-bfcd-96318a266339" containerName="mariadb-database-create"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.566619 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.578067 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.578525 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wh995"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.579015 4743 generic.go:334] "Generic (PLEG): container finished" podID="4537e57c-2a6d-44f9-9724-2ad061bed454" containerID="c3987d1d2cee35f5b6f2659fd2201aebc1fa70e42902b873ee1ac8fbf669cc47" exitCode=0
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.579161 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552606-qc86p" event={"ID":"4537e57c-2a6d-44f9-9724-2ad061bed454","Type":"ContainerDied","Data":"c3987d1d2cee35f5b6f2659fd2201aebc1fa70e42902b873ee1ac8fbf669cc47"}
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.582388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ada4-account-create-update-vgjvq" event={"ID":"e6cbf843-671d-45e8-8844-f1c0c3d9e099","Type":"ContainerDied","Data":"9fec6204fe9a2c9364a685b0a9cb241bc2c4806132676d16e73a5c13a07030d5"}
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.582443 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fec6204fe9a2c9364a685b0a9cb241bc2c4806132676d16e73a5c13a07030d5"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.582533 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ada4-account-create-update-vgjvq"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.583926 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zrnmr"]
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.585495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bz2n7" event={"ID":"c5f8c087-774e-49b4-b660-d3c4d9b72061","Type":"ContainerDied","Data":"c55d1948153a5b19ea69e45be544c78ba45a8147186e4ed323dd334466e6cec8"}
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.585543 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c55d1948153a5b19ea69e45be544c78ba45a8147186e4ed323dd334466e6cec8"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.585560 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bz2n7"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.587300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8746-account-create-update-v57t2" event={"ID":"a74cb34e-ac59-473e-aa30-2c148a81e0ea","Type":"ContainerDied","Data":"bef637fcaef84d268257dd4bd5d1e5e2cae7848e00281a13b86eb2ef07f57588"}
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.587356 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bef637fcaef84d268257dd4bd5d1e5e2cae7848e00281a13b86eb2ef07f57588"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.587445 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8746-account-create-update-v57t2"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.718387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-config-data\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.718668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-db-sync-config-data\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.718839 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgl2q\" (UniqueName: \"kubernetes.io/projected/496f6307-7603-4bfb-8524-86fd78005b43-kube-api-access-dgl2q\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.718894 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-combined-ca-bundle\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.819934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-config-data\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.820052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-db-sync-config-data\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.820097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgl2q\" (UniqueName: \"kubernetes.io/projected/496f6307-7603-4bfb-8524-86fd78005b43-kube-api-access-dgl2q\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.820120 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-combined-ca-bundle\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.824569 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-combined-ca-bundle\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.824584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-config-data\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.824626 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-db-sync-config-data\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.840429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgl2q\" (UniqueName: \"kubernetes.io/projected/496f6307-7603-4bfb-8524-86fd78005b43-kube-api-access-dgl2q\") pod \"glance-db-sync-zrnmr\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:03 crc kubenswrapper[4743]: I0310 15:26:03.903890 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:04 crc kubenswrapper[4743]: I0310 15:26:04.469928 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zrnmr"]
Mar 10 15:26:04 crc kubenswrapper[4743]: W0310 15:26:04.477984 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod496f6307_7603_4bfb_8524_86fd78005b43.slice/crio-09e8437bb3adda224bda1b20a109b76c2be3b59d2c44d12d99fd0186f79797ce WatchSource:0}: Error finding container 09e8437bb3adda224bda1b20a109b76c2be3b59d2c44d12d99fd0186f79797ce: Status 404 returned error can't find the container with id 09e8437bb3adda224bda1b20a109b76c2be3b59d2c44d12d99fd0186f79797ce
Mar 10 15:26:04 crc kubenswrapper[4743]: I0310 15:26:04.479610 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 15:26:04 crc kubenswrapper[4743]: I0310 15:26:04.595187 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zrnmr" event={"ID":"496f6307-7603-4bfb-8524-86fd78005b43","Type":"ContainerStarted","Data":"09e8437bb3adda224bda1b20a109b76c2be3b59d2c44d12d99fd0186f79797ce"}
Mar 10 15:26:04 crc kubenswrapper[4743]: I0310 15:26:04.899355 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552606-qc86p"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.041899 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5wqb\" (UniqueName: \"kubernetes.io/projected/4537e57c-2a6d-44f9-9724-2ad061bed454-kube-api-access-q5wqb\") pod \"4537e57c-2a6d-44f9-9724-2ad061bed454\" (UID: \"4537e57c-2a6d-44f9-9724-2ad061bed454\") "
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.058313 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ksg46"]
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.061285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4537e57c-2a6d-44f9-9724-2ad061bed454-kube-api-access-q5wqb" (OuterVolumeSpecName: "kube-api-access-q5wqb") pod "4537e57c-2a6d-44f9-9724-2ad061bed454" (UID: "4537e57c-2a6d-44f9-9724-2ad061bed454"). InnerVolumeSpecName "kube-api-access-q5wqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.064975 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ksg46"]
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.140913 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7998x"]
Mar 10 15:26:05 crc kubenswrapper[4743]: E0310 15:26:05.141395 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4537e57c-2a6d-44f9-9724-2ad061bed454" containerName="oc"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.141416 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4537e57c-2a6d-44f9-9724-2ad061bed454" containerName="oc"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.141643 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4537e57c-2a6d-44f9-9724-2ad061bed454" containerName="oc"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.142360 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.144447 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz9ml\" (UniqueName: \"kubernetes.io/projected/50276ee0-8660-4161-930a-bda4dd8d92a6-kube-api-access-vz9ml\") pod \"root-account-create-update-7998x\" (UID: \"50276ee0-8660-4161-930a-bda4dd8d92a6\") " pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.144507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50276ee0-8660-4161-930a-bda4dd8d92a6-operator-scripts\") pod \"root-account-create-update-7998x\" (UID: \"50276ee0-8660-4161-930a-bda4dd8d92a6\") " pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.144618 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.144660 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5wqb\" (UniqueName: \"kubernetes.io/projected/4537e57c-2a6d-44f9-9724-2ad061bed454-kube-api-access-q5wqb\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.149830 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7998x"]
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.245948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz9ml\" (UniqueName: \"kubernetes.io/projected/50276ee0-8660-4161-930a-bda4dd8d92a6-kube-api-access-vz9ml\") pod \"root-account-create-update-7998x\" (UID: \"50276ee0-8660-4161-930a-bda4dd8d92a6\") " pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.246006 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50276ee0-8660-4161-930a-bda4dd8d92a6-operator-scripts\") pod \"root-account-create-update-7998x\" (UID: \"50276ee0-8660-4161-930a-bda4dd8d92a6\") " pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.247163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50276ee0-8660-4161-930a-bda4dd8d92a6-operator-scripts\") pod \"root-account-create-update-7998x\" (UID: \"50276ee0-8660-4161-930a-bda4dd8d92a6\") " pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.266093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz9ml\" (UniqueName: \"kubernetes.io/projected/50276ee0-8660-4161-930a-bda4dd8d92a6-kube-api-access-vz9ml\") pod \"root-account-create-update-7998x\" (UID: \"50276ee0-8660-4161-930a-bda4dd8d92a6\") " pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.470526 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.602152 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-lknsq"]
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.609442 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-lknsq"]
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.624741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552606-qc86p" event={"ID":"4537e57c-2a6d-44f9-9724-2ad061bed454","Type":"ContainerDied","Data":"30faccb7bb6b4f7892d40944e6cdba2592e97d380aec8ea1bd8a410e3f730789"}
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.624781 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30faccb7bb6b4f7892d40944e6cdba2592e97d380aec8ea1bd8a410e3f730789"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.624858 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552606-qc86p"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.953078 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2094d4dd-213a-4db3-9768-321112175d9a" path="/var/lib/kubelet/pods/2094d4dd-213a-4db3-9768-321112175d9a/volumes"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.954611 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e27721-5b31-46bb-9514-0b3691820891" path="/var/lib/kubelet/pods/45e27721-5b31-46bb-9514-0b3691820891/volumes"
Mar 10 15:26:05 crc kubenswrapper[4743]: I0310 15:26:05.979333 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7998x"]
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.220992 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q"
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.287732 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z6kh5"]
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.287999 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-z6kh5" podUID="f7975923-a8ea-4269-b285-6fa5a4d639ad" containerName="dnsmasq-dns" containerID="cri-o://da168f79a7b207b9ee1d34d14329d77acf028437bdbecf46287f05dd97913ccc" gracePeriod=10
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.644575 4743 generic.go:334] "Generic (PLEG): container finished" podID="f7975923-a8ea-4269-b285-6fa5a4d639ad" containerID="da168f79a7b207b9ee1d34d14329d77acf028437bdbecf46287f05dd97913ccc" exitCode=0
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.644658 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z6kh5" event={"ID":"f7975923-a8ea-4269-b285-6fa5a4d639ad","Type":"ContainerDied","Data":"da168f79a7b207b9ee1d34d14329d77acf028437bdbecf46287f05dd97913ccc"}
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.646833 4743 generic.go:334] "Generic (PLEG): container finished" podID="1f2a6755-0e08-482b-9815-88840f35fb4e" containerID="9c7363e95d1deab266462107dd5f529c84ce7d11509454f9342c7b22d584e622" exitCode=0
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.646839 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lbfhc" event={"ID":"1f2a6755-0e08-482b-9815-88840f35fb4e","Type":"ContainerDied","Data":"9c7363e95d1deab266462107dd5f529c84ce7d11509454f9342c7b22d584e622"}
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.648514 4743 generic.go:334] "Generic (PLEG): container finished" podID="50276ee0-8660-4161-930a-bda4dd8d92a6" containerID="f42da41bad26b0eb018c23cb059e91151a8898f2abaa19696fef521470bd0419" exitCode=0
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.648546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7998x" event={"ID":"50276ee0-8660-4161-930a-bda4dd8d92a6","Type":"ContainerDied","Data":"f42da41bad26b0eb018c23cb059e91151a8898f2abaa19696fef521470bd0419"}
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.648567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7998x" event={"ID":"50276ee0-8660-4161-930a-bda4dd8d92a6","Type":"ContainerStarted","Data":"e468378069c87eb7348023cf2d0724f3d42327b92f0df739a64ec855109ecaaf"}
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.768685 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.875749 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-sb\") pod \"f7975923-a8ea-4269-b285-6fa5a4d639ad\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") "
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.875826 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-config\") pod \"f7975923-a8ea-4269-b285-6fa5a4d639ad\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") "
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.875862 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wvvn\" (UniqueName: \"kubernetes.io/projected/f7975923-a8ea-4269-b285-6fa5a4d639ad-kube-api-access-2wvvn\") pod \"f7975923-a8ea-4269-b285-6fa5a4d639ad\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") "
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.875920 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-dns-svc\") pod \"f7975923-a8ea-4269-b285-6fa5a4d639ad\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") "
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.876039 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-nb\") pod \"f7975923-a8ea-4269-b285-6fa5a4d639ad\" (UID: \"f7975923-a8ea-4269-b285-6fa5a4d639ad\") "
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.887382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7975923-a8ea-4269-b285-6fa5a4d639ad-kube-api-access-2wvvn" (OuterVolumeSpecName: "kube-api-access-2wvvn") pod "f7975923-a8ea-4269-b285-6fa5a4d639ad" (UID: "f7975923-a8ea-4269-b285-6fa5a4d639ad"). InnerVolumeSpecName "kube-api-access-2wvvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.915792 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7975923-a8ea-4269-b285-6fa5a4d639ad" (UID: "f7975923-a8ea-4269-b285-6fa5a4d639ad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.921009 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7975923-a8ea-4269-b285-6fa5a4d639ad" (UID: "f7975923-a8ea-4269-b285-6fa5a4d639ad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.923411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7975923-a8ea-4269-b285-6fa5a4d639ad" (UID: "f7975923-a8ea-4269-b285-6fa5a4d639ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.924505 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-config" (OuterVolumeSpecName: "config") pod "f7975923-a8ea-4269-b285-6fa5a4d639ad" (UID: "f7975923-a8ea-4269-b285-6fa5a4d639ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.993126 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.993186 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.993246 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.993309 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wvvn\" (UniqueName: \"kubernetes.io/projected/f7975923-a8ea-4269-b285-6fa5a4d639ad-kube-api-access-2wvvn\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:06 crc kubenswrapper[4743]: I0310 15:26:06.993333 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7975923-a8ea-4269-b285-6fa5a4d639ad-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.661639 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z6kh5"
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.661663 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z6kh5" event={"ID":"f7975923-a8ea-4269-b285-6fa5a4d639ad","Type":"ContainerDied","Data":"635b6cd8676edbf55b6db584d99304b8d6a45b516a039971fdd760f8f0a61c2b"}
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.661861 4743 scope.go:117] "RemoveContainer" containerID="da168f79a7b207b9ee1d34d14329d77acf028437bdbecf46287f05dd97913ccc"
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.668916 4743 generic.go:334] "Generic (PLEG): container finished" podID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerID="642a6c5de3968f103baa5bbf0937b99b8102f35959c7bfcd09d2ddb3d9449e34" exitCode=0
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.668991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"884d5267-4e85-481c-96f0-eb31b88bfe67","Type":"ContainerDied","Data":"642a6c5de3968f103baa5bbf0937b99b8102f35959c7bfcd09d2ddb3d9449e34"}
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.677319 4743 generic.go:334] "Generic (PLEG): container finished" podID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerID="290b8969232645596b41011b8e17a74e528a28bee07b274203a8b937c2e4b355" exitCode=0
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.678969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7541c4b7-eda5-4cd5-b0c4-c00621726c2b","Type":"ContainerDied","Data":"290b8969232645596b41011b8e17a74e528a28bee07b274203a8b937c2e4b355"}
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.781241 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z6kh5"]
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.793627 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z6kh5"]
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.876832 4743 scope.go:117] "RemoveContainer" containerID="d0134f01d6d634c3742660c7f3f08906becbb9d03ab4b52225a868c555cb9d58"
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.911150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0"
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.919755 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05770cd2-4275-4fcc-bd98-f8951c4d91ba-etc-swift\") pod \"swift-storage-0\" (UID: \"05770cd2-4275-4fcc-bd98-f8951c4d91ba\") " pod="openstack/swift-storage-0"
Mar 10 15:26:07 crc kubenswrapper[4743]: I0310 15:26:07.931956 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7975923-a8ea-4269-b285-6fa5a4d639ad" path="/var/lib/kubelet/pods/f7975923-a8ea-4269-b285-6fa5a4d639ad/volumes"
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.140859 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7998x"
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.208436 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lbfhc"
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.220769 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.329654 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-scripts\") pod \"1f2a6755-0e08-482b-9815-88840f35fb4e\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.329738 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-combined-ca-bundle\") pod \"1f2a6755-0e08-482b-9815-88840f35fb4e\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.329781 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz9ml\" (UniqueName: \"kubernetes.io/projected/50276ee0-8660-4161-930a-bda4dd8d92a6-kube-api-access-vz9ml\") pod \"50276ee0-8660-4161-930a-bda4dd8d92a6\" (UID: \"50276ee0-8660-4161-930a-bda4dd8d92a6\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.329806 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-dispersionconf\") pod \"1f2a6755-0e08-482b-9815-88840f35fb4e\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.329954 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-ring-data-devices\") pod \"1f2a6755-0e08-482b-9815-88840f35fb4e\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.329998 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50276ee0-8660-4161-930a-bda4dd8d92a6-operator-scripts\") pod \"50276ee0-8660-4161-930a-bda4dd8d92a6\" (UID: \"50276ee0-8660-4161-930a-bda4dd8d92a6\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.330107 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f2a6755-0e08-482b-9815-88840f35fb4e-etc-swift\") pod \"1f2a6755-0e08-482b-9815-88840f35fb4e\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.330533 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-swiftconf\") pod \"1f2a6755-0e08-482b-9815-88840f35fb4e\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.330680 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm7xp\" (UniqueName: \"kubernetes.io/projected/1f2a6755-0e08-482b-9815-88840f35fb4e-kube-api-access-pm7xp\") pod \"1f2a6755-0e08-482b-9815-88840f35fb4e\" (UID: \"1f2a6755-0e08-482b-9815-88840f35fb4e\") "
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.331406 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50276ee0-8660-4161-930a-bda4dd8d92a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50276ee0-8660-4161-930a-bda4dd8d92a6" (UID: "50276ee0-8660-4161-930a-bda4dd8d92a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.331985 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50276ee0-8660-4161-930a-bda4dd8d92a6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.332553 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2a6755-0e08-482b-9815-88840f35fb4e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1f2a6755-0e08-482b-9815-88840f35fb4e" (UID: "1f2a6755-0e08-482b-9815-88840f35fb4e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.334155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1f2a6755-0e08-482b-9815-88840f35fb4e" (UID: "1f2a6755-0e08-482b-9815-88840f35fb4e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.346083 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2a6755-0e08-482b-9815-88840f35fb4e-kube-api-access-pm7xp" (OuterVolumeSpecName: "kube-api-access-pm7xp") pod "1f2a6755-0e08-482b-9815-88840f35fb4e" (UID: "1f2a6755-0e08-482b-9815-88840f35fb4e"). InnerVolumeSpecName "kube-api-access-pm7xp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.353722 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50276ee0-8660-4161-930a-bda4dd8d92a6-kube-api-access-vz9ml" (OuterVolumeSpecName: "kube-api-access-vz9ml") pod "50276ee0-8660-4161-930a-bda4dd8d92a6" (UID: "50276ee0-8660-4161-930a-bda4dd8d92a6"). InnerVolumeSpecName "kube-api-access-vz9ml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.370594 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1f2a6755-0e08-482b-9815-88840f35fb4e" (UID: "1f2a6755-0e08-482b-9815-88840f35fb4e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.373114 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1f2a6755-0e08-482b-9815-88840f35fb4e" (UID: "1f2a6755-0e08-482b-9815-88840f35fb4e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.379207 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-scripts" (OuterVolumeSpecName: "scripts") pod "1f2a6755-0e08-482b-9815-88840f35fb4e" (UID: "1f2a6755-0e08-482b-9815-88840f35fb4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.382164 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f2a6755-0e08-482b-9815-88840f35fb4e" (UID: "1f2a6755-0e08-482b-9815-88840f35fb4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.434467 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.434503 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.434520 4743 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.434534 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz9ml\" (UniqueName: \"kubernetes.io/projected/50276ee0-8660-4161-930a-bda4dd8d92a6-kube-api-access-vz9ml\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.434569 4743 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f2a6755-0e08-482b-9815-88840f35fb4e-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.434582 4743 reconciler_common.go:293] "Volume detached for volume \"etc-swift\"
(UniqueName: \"kubernetes.io/empty-dir/1f2a6755-0e08-482b-9815-88840f35fb4e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.434593 4743 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f2a6755-0e08-482b-9815-88840f35fb4e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.434604 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm7xp\" (UniqueName: \"kubernetes.io/projected/1f2a6755-0e08-482b-9815-88840f35fb4e-kube-api-access-pm7xp\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.704169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7998x" event={"ID":"50276ee0-8660-4161-930a-bda4dd8d92a6","Type":"ContainerDied","Data":"e468378069c87eb7348023cf2d0724f3d42327b92f0df739a64ec855109ecaaf"} Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.704743 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e468378069c87eb7348023cf2d0724f3d42327b92f0df739a64ec855109ecaaf" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.704861 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7998x" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.718465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7541c4b7-eda5-4cd5-b0c4-c00621726c2b","Type":"ContainerStarted","Data":"07e58d41414928253bfac6c3afe501ba6c33a78dd94b06a2b744a115e761b46c"} Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.718898 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.726932 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"884d5267-4e85-481c-96f0-eb31b88bfe67","Type":"ContainerStarted","Data":"039279fbc32f9a50c4bfda5f3da5c7d4eb8a1582a58c193b5befb912702ef757"} Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.727763 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.730920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lbfhc" event={"ID":"1f2a6755-0e08-482b-9815-88840f35fb4e","Type":"ContainerDied","Data":"e18380f6b59496e7450bba73db26cf34ecdb304026cbda48df1f6f60277232da"} Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.730968 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e18380f6b59496e7450bba73db26cf34ecdb304026cbda48df1f6f60277232da" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.731063 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-lbfhc" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.756720 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.196057785 podStartE2EDuration="55.756698219s" podCreationTimestamp="2026-03-10 15:25:13 +0000 UTC" firstStartedPulling="2026-03-10 15:25:16.033865087 +0000 UTC m=+1180.740679835" lastFinishedPulling="2026-03-10 15:25:33.594505521 +0000 UTC m=+1198.301320269" observedRunningTime="2026-03-10 15:26:08.755756152 +0000 UTC m=+1233.462570920" watchObservedRunningTime="2026-03-10 15:26:08.756698219 +0000 UTC m=+1233.463512967" Mar 10 15:26:08 crc kubenswrapper[4743]: I0310 15:26:08.935738 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.956515056 podStartE2EDuration="55.93571375s" podCreationTimestamp="2026-03-10 15:25:13 +0000 UTC" firstStartedPulling="2026-03-10 15:25:15.502225435 +0000 UTC m=+1180.209040183" lastFinishedPulling="2026-03-10 15:25:33.481424119 +0000 UTC m=+1198.188238877" observedRunningTime="2026-03-10 15:26:08.899842284 +0000 UTC m=+1233.606657042" watchObservedRunningTime="2026-03-10 15:26:08.93571375 +0000 UTC m=+1233.642528498" Mar 10 15:26:09 crc kubenswrapper[4743]: I0310 15:26:09.154769 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 15:26:09 crc kubenswrapper[4743]: I0310 15:26:09.745241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"65a756b981d357b435d0d6b1bc7409bd583ba2fa84bd8b0b2745be94195f531c"} Mar 10 15:26:10 crc kubenswrapper[4743]: I0310 15:26:10.191486 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.252607 4743 
patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.252990 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.253072 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.253832 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e87aabce4b79954ca871a938cf484c77623556f05115d13359a5bd9f0c4154c7"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.253886 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://e87aabce4b79954ca871a938cf484c77623556f05115d13359a5bd9f0c4154c7" gracePeriod=600 Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.452677 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7998x"] Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.476347 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-7998x"] Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.763617 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="e87aabce4b79954ca871a938cf484c77623556f05115d13359a5bd9f0c4154c7" exitCode=0 Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.763693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"e87aabce4b79954ca871a938cf484c77623556f05115d13359a5bd9f0c4154c7"} Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.763745 4743 scope.go:117] "RemoveContainer" containerID="9f3c2bb9de0715122300a23c69b01248c82e5f8f70b4cad7c24ef747b724b4d8" Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.766985 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"97b3db232a6902f481873d8ff6dd961726bbb6d6e9be62db7b8b2d69e1834d5a"} Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.767007 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"bd791aa6166405fd44bd6e33c89a543e26abbd08862cbf6932491eed7d8108b6"} Mar 10 15:26:11 crc kubenswrapper[4743]: I0310 15:26:11.927700 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50276ee0-8660-4161-930a-bda4dd8d92a6" path="/var/lib/kubelet/pods/50276ee0-8660-4161-930a-bda4dd8d92a6/volumes" Mar 10 15:26:13 crc kubenswrapper[4743]: I0310 15:26:13.985437 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x7xkr" podUID="b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64" containerName="ovn-controller" probeResult="failure" output=< Mar 10 15:26:13 crc kubenswrapper[4743]: ERROR - 
ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 15:26:13 crc kubenswrapper[4743]: > Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.479076 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9ltml"] Mar 10 15:26:16 crc kubenswrapper[4743]: E0310 15:26:16.479968 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7975923-a8ea-4269-b285-6fa5a4d639ad" containerName="dnsmasq-dns" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.480007 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7975923-a8ea-4269-b285-6fa5a4d639ad" containerName="dnsmasq-dns" Mar 10 15:26:16 crc kubenswrapper[4743]: E0310 15:26:16.480027 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50276ee0-8660-4161-930a-bda4dd8d92a6" containerName="mariadb-account-create-update" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.480034 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="50276ee0-8660-4161-930a-bda4dd8d92a6" containerName="mariadb-account-create-update" Mar 10 15:26:16 crc kubenswrapper[4743]: E0310 15:26:16.480044 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2a6755-0e08-482b-9815-88840f35fb4e" containerName="swift-ring-rebalance" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.480050 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2a6755-0e08-482b-9815-88840f35fb4e" containerName="swift-ring-rebalance" Mar 10 15:26:16 crc kubenswrapper[4743]: E0310 15:26:16.480086 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7975923-a8ea-4269-b285-6fa5a4d639ad" containerName="init" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.480092 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7975923-a8ea-4269-b285-6fa5a4d639ad" containerName="init" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.480325 4743 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1f2a6755-0e08-482b-9815-88840f35fb4e" containerName="swift-ring-rebalance" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.480335 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="50276ee0-8660-4161-930a-bda4dd8d92a6" containerName="mariadb-account-create-update" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.480353 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7975923-a8ea-4269-b285-6fa5a4d639ad" containerName="dnsmasq-dns" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.484052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.486867 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.491016 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9ltml"] Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.598009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d1127c-77f2-479e-8361-6d0736eee46b-operator-scripts\") pod \"root-account-create-update-9ltml\" (UID: \"f5d1127c-77f2-479e-8361-6d0736eee46b\") " pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.598053 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp9pr\" (UniqueName: \"kubernetes.io/projected/f5d1127c-77f2-479e-8361-6d0736eee46b-kube-api-access-kp9pr\") pod \"root-account-create-update-9ltml\" (UID: \"f5d1127c-77f2-479e-8361-6d0736eee46b\") " pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.699360 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d1127c-77f2-479e-8361-6d0736eee46b-operator-scripts\") pod \"root-account-create-update-9ltml\" (UID: \"f5d1127c-77f2-479e-8361-6d0736eee46b\") " pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.699407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp9pr\" (UniqueName: \"kubernetes.io/projected/f5d1127c-77f2-479e-8361-6d0736eee46b-kube-api-access-kp9pr\") pod \"root-account-create-update-9ltml\" (UID: \"f5d1127c-77f2-479e-8361-6d0736eee46b\") " pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.700465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d1127c-77f2-479e-8361-6d0736eee46b-operator-scripts\") pod \"root-account-create-update-9ltml\" (UID: \"f5d1127c-77f2-479e-8361-6d0736eee46b\") " pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.719651 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp9pr\" (UniqueName: \"kubernetes.io/projected/f5d1127c-77f2-479e-8361-6d0736eee46b-kube-api-access-kp9pr\") pod \"root-account-create-update-9ltml\" (UID: \"f5d1127c-77f2-479e-8361-6d0736eee46b\") " pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:16 crc kubenswrapper[4743]: I0310 15:26:16.810532 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.263742 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9ltml"] Mar 10 15:26:18 crc kubenswrapper[4743]: W0310 15:26:18.274497 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d1127c_77f2_479e_8361_6d0736eee46b.slice/crio-fb1cfc95935a79a6923c4d86633c5c5320c4f138b8905fe2b9b805383665716e WatchSource:0}: Error finding container fb1cfc95935a79a6923c4d86633c5c5320c4f138b8905fe2b9b805383665716e: Status 404 returned error can't find the container with id fb1cfc95935a79a6923c4d86633c5c5320c4f138b8905fe2b9b805383665716e Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.834688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"fd61460384d6cf2cf3d97e1581a2275d487c44dd87d687ca072c07eb9d139f79"} Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.837233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zrnmr" event={"ID":"496f6307-7603-4bfb-8524-86fd78005b43","Type":"ContainerStarted","Data":"0b6b14849e9c1267474dd716845908ea7eac67f43f201fe03212fd3307a38fed"} Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.839927 4743 generic.go:334] "Generic (PLEG): container finished" podID="f5d1127c-77f2-479e-8361-6d0736eee46b" containerID="63e08faef298f1ca013fc2eb3b8539599ba6cd5f8a29056bf9a00ae5cb615025" exitCode=0 Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.839982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9ltml" event={"ID":"f5d1127c-77f2-479e-8361-6d0736eee46b","Type":"ContainerDied","Data":"63e08faef298f1ca013fc2eb3b8539599ba6cd5f8a29056bf9a00ae5cb615025"} 
Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.840106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9ltml" event={"ID":"f5d1127c-77f2-479e-8361-6d0736eee46b","Type":"ContainerStarted","Data":"fb1cfc95935a79a6923c4d86633c5c5320c4f138b8905fe2b9b805383665716e"} Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.842288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"f84ab8dbc2b95fdd7d39fab8ca7d8d1532bd03f1327a416f41437c9dbfaeb3a1"} Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.842311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"1dbd0095bbe0c21e863ab4de080e38f2d0324222887c04a65860f01bfc1ffda9"} Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.892243 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zrnmr" podStartSLOduration=2.5541176549999998 podStartE2EDuration="15.892225268s" podCreationTimestamp="2026-03-10 15:26:03 +0000 UTC" firstStartedPulling="2026-03-10 15:26:04.479420667 +0000 UTC m=+1229.186235415" lastFinishedPulling="2026-03-10 15:26:17.81752828 +0000 UTC m=+1242.524343028" observedRunningTime="2026-03-10 15:26:18.878801268 +0000 UTC m=+1243.585616016" watchObservedRunningTime="2026-03-10 15:26:18.892225268 +0000 UTC m=+1243.599040016" Mar 10 15:26:18 crc kubenswrapper[4743]: I0310 15:26:18.995058 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x7xkr" podUID="b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64" containerName="ovn-controller" probeResult="failure" output=< Mar 10 15:26:18 crc kubenswrapper[4743]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 15:26:18 crc kubenswrapper[4743]: > Mar 10 
15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.007630 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.020197 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2xslw" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.246163 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x7xkr-config-gvv6j"] Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.247659 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.258766 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.282879 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7xkr-config-gvv6j"] Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.365551 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-scripts\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.365619 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-log-ovn\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.365655 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knxjc\" (UniqueName: \"kubernetes.io/projected/480c6589-84e6-4994-814f-020b9fbfb555-kube-api-access-knxjc\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.365692 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run-ovn\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.365807 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-additional-scripts\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.365885 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.467795 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-additional-scripts\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 
15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.467968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.468065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-scripts\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.468145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-log-ovn\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.468222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knxjc\" (UniqueName: \"kubernetes.io/projected/480c6589-84e6-4994-814f-020b9fbfb555-kube-api-access-knxjc\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.468303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run-ovn\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc 
kubenswrapper[4743]: I0310 15:26:19.468543 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run-ovn\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.468627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.468546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-log-ovn\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.468735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-additional-scripts\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.470264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-scripts\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.495730 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-knxjc\" (UniqueName: \"kubernetes.io/projected/480c6589-84e6-4994-814f-020b9fbfb555-kube-api-access-knxjc\") pod \"ovn-controller-x7xkr-config-gvv6j\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.574749 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:19 crc kubenswrapper[4743]: I0310 15:26:19.857745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"f1390ff540739ed276d1f3cd42203c944dad5623a929aa796da8380b4601fde1"} Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.269269 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7xkr-config-gvv6j"] Mar 10 15:26:20 crc kubenswrapper[4743]: W0310 15:26:20.282903 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod480c6589_84e6_4994_814f_020b9fbfb555.slice/crio-dec782617f6ee9e2f662f9856bc2ae41cfcfba6d2966017691574aa5f0c17a59 WatchSource:0}: Error finding container dec782617f6ee9e2f662f9856bc2ae41cfcfba6d2966017691574aa5f0c17a59: Status 404 returned error can't find the container with id dec782617f6ee9e2f662f9856bc2ae41cfcfba6d2966017691574aa5f0c17a59 Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.309244 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.387514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d1127c-77f2-479e-8361-6d0736eee46b-operator-scripts\") pod \"f5d1127c-77f2-479e-8361-6d0736eee46b\" (UID: \"f5d1127c-77f2-479e-8361-6d0736eee46b\") " Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.387750 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp9pr\" (UniqueName: \"kubernetes.io/projected/f5d1127c-77f2-479e-8361-6d0736eee46b-kube-api-access-kp9pr\") pod \"f5d1127c-77f2-479e-8361-6d0736eee46b\" (UID: \"f5d1127c-77f2-479e-8361-6d0736eee46b\") " Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.393289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d1127c-77f2-479e-8361-6d0736eee46b-kube-api-access-kp9pr" (OuterVolumeSpecName: "kube-api-access-kp9pr") pod "f5d1127c-77f2-479e-8361-6d0736eee46b" (UID: "f5d1127c-77f2-479e-8361-6d0736eee46b"). InnerVolumeSpecName "kube-api-access-kp9pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.395471 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d1127c-77f2-479e-8361-6d0736eee46b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5d1127c-77f2-479e-8361-6d0736eee46b" (UID: "f5d1127c-77f2-479e-8361-6d0736eee46b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.498129 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5d1127c-77f2-479e-8361-6d0736eee46b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.498172 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp9pr\" (UniqueName: \"kubernetes.io/projected/f5d1127c-77f2-479e-8361-6d0736eee46b-kube-api-access-kp9pr\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.873762 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"71d47fb7b20b93f247d6033d9f9fe8d86099ef143c0663a00fa14d8a45ef5719"} Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.874348 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"c94892657face57c2735f26c4294c7199a7bc8188293293d24fb8ccbb8b4f8b8"} Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.874397 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"eac5d5c6c6dcb1be20edd769ff8a4caac63cec347b3af40a22ace13e180103d8"} Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.876363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7xkr-config-gvv6j" event={"ID":"480c6589-84e6-4994-814f-020b9fbfb555","Type":"ContainerStarted","Data":"4ff724144f51411d7d79c75a20f28baefb91fc871fbaf955bcfedf649995b9e1"} Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.876397 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-x7xkr-config-gvv6j" event={"ID":"480c6589-84e6-4994-814f-020b9fbfb555","Type":"ContainerStarted","Data":"dec782617f6ee9e2f662f9856bc2ae41cfcfba6d2966017691574aa5f0c17a59"} Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.878828 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9ltml" event={"ID":"f5d1127c-77f2-479e-8361-6d0736eee46b","Type":"ContainerDied","Data":"fb1cfc95935a79a6923c4d86633c5c5320c4f138b8905fe2b9b805383665716e"} Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.878870 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb1cfc95935a79a6923c4d86633c5c5320c4f138b8905fe2b9b805383665716e" Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.878873 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9ltml" Mar 10 15:26:20 crc kubenswrapper[4743]: I0310 15:26:20.903555 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-x7xkr-config-gvv6j" podStartSLOduration=1.903535022 podStartE2EDuration="1.903535022s" podCreationTimestamp="2026-03-10 15:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:20.896977177 +0000 UTC m=+1245.603791935" watchObservedRunningTime="2026-03-10 15:26:20.903535022 +0000 UTC m=+1245.610349770" Mar 10 15:26:21 crc kubenswrapper[4743]: I0310 15:26:21.888181 4743 generic.go:334] "Generic (PLEG): container finished" podID="480c6589-84e6-4994-814f-020b9fbfb555" containerID="4ff724144f51411d7d79c75a20f28baefb91fc871fbaf955bcfedf649995b9e1" exitCode=0 Mar 10 15:26:21 crc kubenswrapper[4743]: I0310 15:26:21.889014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7xkr-config-gvv6j" 
event={"ID":"480c6589-84e6-4994-814f-020b9fbfb555","Type":"ContainerDied","Data":"4ff724144f51411d7d79c75a20f28baefb91fc871fbaf955bcfedf649995b9e1"} Mar 10 15:26:22 crc kubenswrapper[4743]: I0310 15:26:22.905372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"e501fd3d21b226b72819ab0b65857268f8cd85a40ab86bd5d96cb19ab7b3f1b5"} Mar 10 15:26:22 crc kubenswrapper[4743]: I0310 15:26:22.906096 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"e19acde40c48feb63eca0d19737437d256b4a459b31c2ce94bfe50c0026d8019"} Mar 10 15:26:22 crc kubenswrapper[4743]: I0310 15:26:22.906113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"779635bcf9f3d960d6999cfd6a612ef09ad05d5aa32737379dcff990be35a106"} Mar 10 15:26:22 crc kubenswrapper[4743]: I0310 15:26:22.906125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"944e453bcbe109a189973d239f62943fa27dd6aaa57e679055b38d30b0372dde"} Mar 10 15:26:22 crc kubenswrapper[4743]: I0310 15:26:22.906137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"54c658e9ad8a66d916cf0dc584e4388c6485ebb989b3523beca988d7fad14777"} Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.211456 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.348692 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knxjc\" (UniqueName: \"kubernetes.io/projected/480c6589-84e6-4994-814f-020b9fbfb555-kube-api-access-knxjc\") pod \"480c6589-84e6-4994-814f-020b9fbfb555\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.348750 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run\") pod \"480c6589-84e6-4994-814f-020b9fbfb555\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.348889 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run-ovn\") pod \"480c6589-84e6-4994-814f-020b9fbfb555\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.348924 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-additional-scripts\") pod \"480c6589-84e6-4994-814f-020b9fbfb555\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.348980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-log-ovn\") pod \"480c6589-84e6-4994-814f-020b9fbfb555\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.349014 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-scripts\") pod \"480c6589-84e6-4994-814f-020b9fbfb555\" (UID: \"480c6589-84e6-4994-814f-020b9fbfb555\") " Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.350164 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run" (OuterVolumeSpecName: "var-run") pod "480c6589-84e6-4994-814f-020b9fbfb555" (UID: "480c6589-84e6-4994-814f-020b9fbfb555"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.350392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-scripts" (OuterVolumeSpecName: "scripts") pod "480c6589-84e6-4994-814f-020b9fbfb555" (UID: "480c6589-84e6-4994-814f-020b9fbfb555"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.350442 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "480c6589-84e6-4994-814f-020b9fbfb555" (UID: "480c6589-84e6-4994-814f-020b9fbfb555"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.350802 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "480c6589-84e6-4994-814f-020b9fbfb555" (UID: "480c6589-84e6-4994-814f-020b9fbfb555"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.350887 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "480c6589-84e6-4994-814f-020b9fbfb555" (UID: "480c6589-84e6-4994-814f-020b9fbfb555"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.352020 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x7xkr-config-gvv6j"] Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.353695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480c6589-84e6-4994-814f-020b9fbfb555-kube-api-access-knxjc" (OuterVolumeSpecName: "kube-api-access-knxjc") pod "480c6589-84e6-4994-814f-020b9fbfb555" (UID: "480c6589-84e6-4994-814f-020b9fbfb555"). InnerVolumeSpecName "kube-api-access-knxjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.359055 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-x7xkr-config-gvv6j"] Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.450672 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.450709 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.450719 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.450728 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/480c6589-84e6-4994-814f-020b9fbfb555-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.450736 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knxjc\" (UniqueName: \"kubernetes.io/projected/480c6589-84e6-4994-814f-020b9fbfb555-kube-api-access-knxjc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.450748 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/480c6589-84e6-4994-814f-020b9fbfb555-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.504579 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x7xkr-config-qc5c2"] Mar 10 15:26:23 
crc kubenswrapper[4743]: E0310 15:26:23.505167 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d1127c-77f2-479e-8361-6d0736eee46b" containerName="mariadb-account-create-update" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.505198 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d1127c-77f2-479e-8361-6d0736eee46b" containerName="mariadb-account-create-update" Mar 10 15:26:23 crc kubenswrapper[4743]: E0310 15:26:23.505234 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480c6589-84e6-4994-814f-020b9fbfb555" containerName="ovn-config" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.505248 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="480c6589-84e6-4994-814f-020b9fbfb555" containerName="ovn-config" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.505624 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d1127c-77f2-479e-8361-6d0736eee46b" containerName="mariadb-account-create-update" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.505669 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="480c6589-84e6-4994-814f-020b9fbfb555" containerName="ovn-config" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.506524 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.514647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7xkr-config-qc5c2"] Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.653734 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-additional-scripts\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.653790 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hb5d\" (UniqueName: \"kubernetes.io/projected/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-kube-api-access-8hb5d\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.653897 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.653923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run-ovn\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.654105 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-log-ovn\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.654240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-scripts\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.756656 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-additional-scripts\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.756706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hb5d\" (UniqueName: \"kubernetes.io/projected/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-kube-api-access-8hb5d\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.756784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 
15:26:23.756801 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run-ovn\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.756830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-log-ovn\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.756863 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-scripts\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.757216 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run-ovn\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.757240 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.757339 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-log-ovn\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.757623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-additional-scripts\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.758770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-scripts\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.775858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hb5d\" (UniqueName: \"kubernetes.io/projected/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-kube-api-access-8hb5d\") pod \"ovn-controller-x7xkr-config-qc5c2\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.834625 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.925790 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x7xkr-config-gvv6j" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.927118 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480c6589-84e6-4994-814f-020b9fbfb555" path="/var/lib/kubelet/pods/480c6589-84e6-4994-814f-020b9fbfb555/volumes" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.927755 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"a4f90160b733a67e50ae11817ab25aff4696545ee96a472f52f42313b3a4c9a1"} Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.940516 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05770cd2-4275-4fcc-bd98-f8951c4d91ba","Type":"ContainerStarted","Data":"0e77d921e81e7ce13cde8deef9ba5b6e2535b1463aee0a3b2c5bb270e1b72ce0"} Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.940592 4743 scope.go:117] "RemoveContainer" containerID="4ff724144f51411d7d79c75a20f28baefb91fc871fbaf955bcfedf649995b9e1" Mar 10 15:26:23 crc kubenswrapper[4743]: I0310 15:26:23.971862 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.384333208 podStartE2EDuration="33.971840833s" podCreationTimestamp="2026-03-10 15:25:50 +0000 UTC" firstStartedPulling="2026-03-10 15:26:09.191070551 +0000 UTC m=+1233.897885299" lastFinishedPulling="2026-03-10 15:26:21.778578176 +0000 UTC m=+1246.485392924" observedRunningTime="2026-03-10 15:26:23.9703305 +0000 UTC m=+1248.677145268" watchObservedRunningTime="2026-03-10 15:26:23.971840833 +0000 UTC m=+1248.678655581" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.093226 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-x7xkr" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.285214 4743 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdf9x"] Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.287159 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.290179 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.306365 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdf9x"] Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.370608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2lvw\" (UniqueName: \"kubernetes.io/projected/2b12b745-718e-4ee6-9751-ba679e3e274f-kube-api-access-s2lvw\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.370663 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.371205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.371564 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-config\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.371623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.371704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.417668 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7xkr-config-qc5c2"] Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.473527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2lvw\" (UniqueName: \"kubernetes.io/projected/2b12b745-718e-4ee6-9751-ba679e3e274f-kube-api-access-s2lvw\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.473916 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 
15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.473993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.474028 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-config\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.474052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.474076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.474950 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.475006 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.476861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-config\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.476909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.477000 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.495007 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2lvw\" (UniqueName: \"kubernetes.io/projected/2b12b745-718e-4ee6-9751-ba679e3e274f-kube-api-access-s2lvw\") pod \"dnsmasq-dns-5c79d794d7-pdf9x\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.603933 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.936217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7xkr-config-qc5c2" event={"ID":"67ae6d46-b396-4ab7-bd09-e1dae4b34f96","Type":"ContainerStarted","Data":"1f1f3c45c769310510c2e6504caf2722b6faf4243fae7bb382cb9e90a21c66b8"} Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.936681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7xkr-config-qc5c2" event={"ID":"67ae6d46-b396-4ab7-bd09-e1dae4b34f96","Type":"ContainerStarted","Data":"c069efc47d17c78c82566aaee673135c87b45b76a68cb9ea0b554b9c5b88e7e4"} Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.955606 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-x7xkr-config-qc5c2" podStartSLOduration=1.955586185 podStartE2EDuration="1.955586185s" podCreationTimestamp="2026-03-10 15:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:24.954746931 +0000 UTC m=+1249.661561679" watchObservedRunningTime="2026-03-10 15:26:24.955586185 +0000 UTC m=+1249.662400933" Mar 10 15:26:24 crc kubenswrapper[4743]: I0310 15:26:24.985984 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.055510 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdf9x"] Mar 10 15:26:25 crc kubenswrapper[4743]: W0310 15:26:25.063961 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b12b745_718e_4ee6_9751_ba679e3e274f.slice/crio-8ec76bd88a3a4af845045370828acf97b1f34441ffb894d8717d4dd099e32e6c WatchSource:0}: Error finding container 
8ec76bd88a3a4af845045370828acf97b1f34441ffb894d8717d4dd099e32e6c: Status 404 returned error can't find the container with id 8ec76bd88a3a4af845045370828acf97b1f34441ffb894d8717d4dd099e32e6c Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.444990 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.946502 4743 generic.go:334] "Generic (PLEG): container finished" podID="2b12b745-718e-4ee6-9751-ba679e3e274f" containerID="8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856" exitCode=0 Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.946625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" event={"ID":"2b12b745-718e-4ee6-9751-ba679e3e274f","Type":"ContainerDied","Data":"8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856"} Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.946685 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" event={"ID":"2b12b745-718e-4ee6-9751-ba679e3e274f","Type":"ContainerStarted","Data":"8ec76bd88a3a4af845045370828acf97b1f34441ffb894d8717d4dd099e32e6c"} Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.948788 4743 generic.go:334] "Generic (PLEG): container finished" podID="67ae6d46-b396-4ab7-bd09-e1dae4b34f96" containerID="1f1f3c45c769310510c2e6504caf2722b6faf4243fae7bb382cb9e90a21c66b8" exitCode=0 Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.948863 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7xkr-config-qc5c2" event={"ID":"67ae6d46-b396-4ab7-bd09-e1dae4b34f96","Type":"ContainerDied","Data":"1f1f3c45c769310510c2e6504caf2722b6faf4243fae7bb382cb9e90a21c66b8"} Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.951306 4743 generic.go:334] "Generic (PLEG): container finished" podID="496f6307-7603-4bfb-8524-86fd78005b43" 
containerID="0b6b14849e9c1267474dd716845908ea7eac67f43f201fe03212fd3307a38fed" exitCode=0 Mar 10 15:26:25 crc kubenswrapper[4743]: I0310 15:26:25.951341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zrnmr" event={"ID":"496f6307-7603-4bfb-8524-86fd78005b43","Type":"ContainerDied","Data":"0b6b14849e9c1267474dd716845908ea7eac67f43f201fe03212fd3307a38fed"} Mar 10 15:26:26 crc kubenswrapper[4743]: I0310 15:26:26.962079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" event={"ID":"2b12b745-718e-4ee6-9751-ba679e3e274f","Type":"ContainerStarted","Data":"f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce"} Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.003197 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" podStartSLOduration=3.003175107 podStartE2EDuration="3.003175107s" podCreationTimestamp="2026-03-10 15:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:26.993977656 +0000 UTC m=+1251.700792484" watchObservedRunningTime="2026-03-10 15:26:27.003175107 +0000 UTC m=+1251.709989865" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.391109 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x7xkr-config-qc5c2" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.412725 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-n25pb"] Mar 10 15:26:27 crc kubenswrapper[4743]: E0310 15:26:27.413241 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ae6d46-b396-4ab7-bd09-e1dae4b34f96" containerName="ovn-config" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.413271 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ae6d46-b396-4ab7-bd09-e1dae4b34f96" containerName="ovn-config" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.413497 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ae6d46-b396-4ab7-bd09-e1dae4b34f96" containerName="ovn-config" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.414216 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.426863 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run-ovn\") pod \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.426982 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-log-ovn\") pod \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.427032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-scripts\") pod \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\" (UID: 
\"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.427086 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run\") pod \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.427116 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-additional-scripts\") pod \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.427198 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hb5d\" (UniqueName: \"kubernetes.io/projected/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-kube-api-access-8hb5d\") pod \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\" (UID: \"67ae6d46-b396-4ab7-bd09-e1dae4b34f96\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.427436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cwh\" (UniqueName: \"kubernetes.io/projected/2f1124ae-ce38-4379-b981-cb509dc25ce7-kube-api-access-68cwh\") pod \"cinder-db-create-n25pb\" (UID: \"2f1124ae-ce38-4379-b981-cb509dc25ce7\") " pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.427521 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1124ae-ce38-4379-b981-cb509dc25ce7-operator-scripts\") pod \"cinder-db-create-n25pb\" (UID: \"2f1124ae-ce38-4379-b981-cb509dc25ce7\") " pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.427654 4743 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run" (OuterVolumeSpecName: "var-run") pod "67ae6d46-b396-4ab7-bd09-e1dae4b34f96" (UID: "67ae6d46-b396-4ab7-bd09-e1dae4b34f96"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.429193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "67ae6d46-b396-4ab7-bd09-e1dae4b34f96" (UID: "67ae6d46-b396-4ab7-bd09-e1dae4b34f96"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.429246 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "67ae6d46-b396-4ab7-bd09-e1dae4b34f96" (UID: "67ae6d46-b396-4ab7-bd09-e1dae4b34f96"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.432350 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "67ae6d46-b396-4ab7-bd09-e1dae4b34f96" (UID: "67ae6d46-b396-4ab7-bd09-e1dae4b34f96"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.432911 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-scripts" (OuterVolumeSpecName: "scripts") pod "67ae6d46-b396-4ab7-bd09-e1dae4b34f96" (UID: "67ae6d46-b396-4ab7-bd09-e1dae4b34f96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.447994 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2567-account-create-update-vfqwd"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.449476 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.454453 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.456163 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-kube-api-access-8hb5d" (OuterVolumeSpecName: "kube-api-access-8hb5d") pod "67ae6d46-b396-4ab7-bd09-e1dae4b34f96" (UID: "67ae6d46-b396-4ab7-bd09-e1dae4b34f96"). InnerVolumeSpecName "kube-api-access-8hb5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.462167 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-n25pb"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.474427 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2567-account-create-update-vfqwd"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.521105 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-0647-account-create-update-wpz7z"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.522281 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.531588 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1124ae-ce38-4379-b981-cb509dc25ce7-operator-scripts\") pod \"cinder-db-create-n25pb\" (UID: \"2f1124ae-ce38-4379-b981-cb509dc25ce7\") " pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c6da81-8a78-4383-aa08-580c5242a582-operator-scripts\") pod \"cinder-2567-account-create-update-vfqwd\" (UID: \"82c6da81-8a78-4383-aa08-580c5242a582\") " pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540592 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68cwh\" (UniqueName: 
\"kubernetes.io/projected/2f1124ae-ce38-4379-b981-cb509dc25ce7-kube-api-access-68cwh\") pod \"cinder-db-create-n25pb\" (UID: \"2f1124ae-ce38-4379-b981-cb509dc25ce7\") " pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfb57\" (UniqueName: \"kubernetes.io/projected/82c6da81-8a78-4383-aa08-580c5242a582-kube-api-access-tfb57\") pod \"cinder-2567-account-create-update-vfqwd\" (UID: \"82c6da81-8a78-4383-aa08-580c5242a582\") " pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540707 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540717 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540726 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540734 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.540742 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:27 crc 
kubenswrapper[4743]: I0310 15:26:27.540750 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hb5d\" (UniqueName: \"kubernetes.io/projected/67ae6d46-b396-4ab7-bd09-e1dae4b34f96-kube-api-access-8hb5d\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.541496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1124ae-ce38-4379-b981-cb509dc25ce7-operator-scripts\") pod \"cinder-db-create-n25pb\" (UID: \"2f1124ae-ce38-4379-b981-cb509dc25ce7\") " pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.573899 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x7xkr-config-qc5c2"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.580063 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cwh\" (UniqueName: \"kubernetes.io/projected/2f1124ae-ce38-4379-b981-cb509dc25ce7-kube-api-access-68cwh\") pod \"cinder-db-create-n25pb\" (UID: \"2f1124ae-ce38-4379-b981-cb509dc25ce7\") " pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.590193 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-x7xkr-config-qc5c2"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.598403 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-0647-account-create-update-wpz7z"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.605150 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-2rq5t"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.606137 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.622760 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-2rq5t"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.642254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-operator-scripts\") pod \"manila-0647-account-create-update-wpz7z\" (UID: \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\") " pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.642305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5kmj\" (UniqueName: \"kubernetes.io/projected/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-kube-api-access-j5kmj\") pod \"manila-0647-account-create-update-wpz7z\" (UID: \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\") " pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.642352 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfb57\" (UniqueName: \"kubernetes.io/projected/82c6da81-8a78-4383-aa08-580c5242a582-kube-api-access-tfb57\") pod \"cinder-2567-account-create-update-vfqwd\" (UID: \"82c6da81-8a78-4383-aa08-580c5242a582\") " pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.642416 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfckk\" (UniqueName: \"kubernetes.io/projected/a6fe554d-63d8-4ba6-948b-d8658db57faa-kube-api-access-wfckk\") pod \"manila-db-create-2rq5t\" (UID: \"a6fe554d-63d8-4ba6-948b-d8658db57faa\") " pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.642464 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c6da81-8a78-4383-aa08-580c5242a582-operator-scripts\") pod \"cinder-2567-account-create-update-vfqwd\" (UID: \"82c6da81-8a78-4383-aa08-580c5242a582\") " pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.642495 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fe554d-63d8-4ba6-948b-d8658db57faa-operator-scripts\") pod \"manila-db-create-2rq5t\" (UID: \"a6fe554d-63d8-4ba6-948b-d8658db57faa\") " pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.644156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c6da81-8a78-4383-aa08-580c5242a582-operator-scripts\") pod \"cinder-2567-account-create-update-vfqwd\" (UID: \"82c6da81-8a78-4383-aa08-580c5242a582\") " pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.685495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfb57\" (UniqueName: \"kubernetes.io/projected/82c6da81-8a78-4383-aa08-580c5242a582-kube-api-access-tfb57\") pod \"cinder-2567-account-create-update-vfqwd\" (UID: \"82c6da81-8a78-4383-aa08-580c5242a582\") " pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.714063 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7mqd4"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.715873 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.718828 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.718947 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.719303 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wwgbc" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.719839 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.719990 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7mqd4"] Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.740865 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.744189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dm4h\" (UniqueName: \"kubernetes.io/projected/49e25cc1-8829-47fa-8a68-9968a7ba8e75-kube-api-access-5dm4h\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.744256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfckk\" (UniqueName: \"kubernetes.io/projected/a6fe554d-63d8-4ba6-948b-d8658db57faa-kube-api-access-wfckk\") pod \"manila-db-create-2rq5t\" (UID: \"a6fe554d-63d8-4ba6-948b-d8658db57faa\") " pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.744311 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-config-data\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.744337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-combined-ca-bundle\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.744368 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fe554d-63d8-4ba6-948b-d8658db57faa-operator-scripts\") pod \"manila-db-create-2rq5t\" (UID: \"a6fe554d-63d8-4ba6-948b-d8658db57faa\") " pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.744442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-operator-scripts\") pod \"manila-0647-account-create-update-wpz7z\" (UID: \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\") " pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.744472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5kmj\" (UniqueName: \"kubernetes.io/projected/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-kube-api-access-j5kmj\") pod \"manila-0647-account-create-update-wpz7z\" (UID: \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\") " pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.745527 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fe554d-63d8-4ba6-948b-d8658db57faa-operator-scripts\") pod \"manila-db-create-2rq5t\" (UID: \"a6fe554d-63d8-4ba6-948b-d8658db57faa\") " pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.745610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-operator-scripts\") pod \"manila-0647-account-create-update-wpz7z\" (UID: \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\") " pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.766338 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5kmj\" (UniqueName: \"kubernetes.io/projected/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-kube-api-access-j5kmj\") pod \"manila-0647-account-create-update-wpz7z\" (UID: \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\") " pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.766872 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfckk\" (UniqueName: \"kubernetes.io/projected/a6fe554d-63d8-4ba6-948b-d8658db57faa-kube-api-access-wfckk\") pod \"manila-db-create-2rq5t\" (UID: \"a6fe554d-63d8-4ba6-948b-d8658db57faa\") " pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.777287 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zrnmr" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.816494 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.828636 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ng46f"] Mar 10 15:26:27 crc kubenswrapper[4743]: E0310 15:26:27.829099 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496f6307-7603-4bfb-8524-86fd78005b43" containerName="glance-db-sync" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.829114 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="496f6307-7603-4bfb-8524-86fd78005b43" containerName="glance-db-sync" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.829305 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="496f6307-7603-4bfb-8524-86fd78005b43" containerName="glance-db-sync" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.829941 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ng46f" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.845592 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-config-data\") pod \"496f6307-7603-4bfb-8524-86fd78005b43\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.845904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-combined-ca-bundle\") pod \"496f6307-7603-4bfb-8524-86fd78005b43\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.845947 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-db-sync-config-data\") pod 
\"496f6307-7603-4bfb-8524-86fd78005b43\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.845977 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgl2q\" (UniqueName: \"kubernetes.io/projected/496f6307-7603-4bfb-8524-86fd78005b43-kube-api-access-dgl2q\") pod \"496f6307-7603-4bfb-8524-86fd78005b43\" (UID: \"496f6307-7603-4bfb-8524-86fd78005b43\") " Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.847798 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dm4h\" (UniqueName: \"kubernetes.io/projected/49e25cc1-8829-47fa-8a68-9968a7ba8e75-kube-api-access-5dm4h\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.847861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-config-data\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.847885 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-combined-ca-bundle\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.852515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-combined-ca-bundle\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:27 crc 
kubenswrapper[4743]: I0310 15:26:27.858074 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "496f6307-7603-4bfb-8524-86fd78005b43" (UID: "496f6307-7603-4bfb-8524-86fd78005b43"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.872254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496f6307-7603-4bfb-8524-86fd78005b43-kube-api-access-dgl2q" (OuterVolumeSpecName: "kube-api-access-dgl2q") pod "496f6307-7603-4bfb-8524-86fd78005b43" (UID: "496f6307-7603-4bfb-8524-86fd78005b43"). InnerVolumeSpecName "kube-api-access-dgl2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.873901 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-0647-account-create-update-wpz7z"
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.873929 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-config-data\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4"
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.875020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ng46f"]
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.882366 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dm4h\" (UniqueName: \"kubernetes.io/projected/49e25cc1-8829-47fa-8a68-9968a7ba8e75-kube-api-access-5dm4h\") pod \"keystone-db-sync-7mqd4\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " pod="openstack/keystone-db-sync-7mqd4"
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.936515 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "496f6307-7603-4bfb-8524-86fd78005b43" (UID: "496f6307-7603-4bfb-8524-86fd78005b43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.950459 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700303a1-61db-4669-9dba-31bba76aa8a5-operator-scripts\") pod \"barbican-db-create-ng46f\" (UID: \"700303a1-61db-4669-9dba-31bba76aa8a5\") " pod="openstack/barbican-db-create-ng46f"
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.950520 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnm72\" (UniqueName: \"kubernetes.io/projected/700303a1-61db-4669-9dba-31bba76aa8a5-kube-api-access-lnm72\") pod \"barbican-db-create-ng46f\" (UID: \"700303a1-61db-4669-9dba-31bba76aa8a5\") " pod="openstack/barbican-db-create-ng46f"
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.950587 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.950602 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgl2q\" (UniqueName: \"kubernetes.io/projected/496f6307-7603-4bfb-8524-86fd78005b43-kube-api-access-dgl2q\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.950618 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.950901 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-2rq5t"
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.959537 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ae6d46-b396-4ab7-bd09-e1dae4b34f96" path="/var/lib/kubelet/pods/67ae6d46-b396-4ab7-bd09-e1dae4b34f96/volumes"
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.982752 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-config-data" (OuterVolumeSpecName: "config-data") pod "496f6307-7603-4bfb-8524-86fd78005b43" (UID: "496f6307-7603-4bfb-8524-86fd78005b43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.993793 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zm929"]
Mar 10 15:26:27 crc kubenswrapper[4743]: I0310 15:26:27.995651 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zm929"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.017475 4743 scope.go:117] "RemoveContainer" containerID="1f1f3c45c769310510c2e6504caf2722b6faf4243fae7bb382cb9e90a21c66b8"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.017623 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7xkr-config-qc5c2"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.033274 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8d7e-account-create-update-8j226"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.034555 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8d7e-account-create-update-8j226"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.038546 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.043929 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7mqd4"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.047761 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zrnmr"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.049982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zrnmr" event={"ID":"496f6307-7603-4bfb-8524-86fd78005b43","Type":"ContainerDied","Data":"09e8437bb3adda224bda1b20a109b76c2be3b59d2c44d12d99fd0186f79797ce"}
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.050025 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e8437bb3adda224bda1b20a109b76c2be3b59d2c44d12d99fd0186f79797ce"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.050047 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zm929"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.050068 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.060789 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlcp2\" (UniqueName: \"kubernetes.io/projected/ba22cc82-1acd-4b8b-b7a8-3d598262b490-kube-api-access-mlcp2\") pod \"neutron-8d7e-account-create-update-8j226\" (UID: \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\") " pod="openstack/neutron-8d7e-account-create-update-8j226"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.060918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700303a1-61db-4669-9dba-31bba76aa8a5-operator-scripts\") pod \"barbican-db-create-ng46f\" (UID: \"700303a1-61db-4669-9dba-31bba76aa8a5\") " pod="openstack/barbican-db-create-ng46f"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.060945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba22cc82-1acd-4b8b-b7a8-3d598262b490-operator-scripts\") pod \"neutron-8d7e-account-create-update-8j226\" (UID: \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\") " pod="openstack/neutron-8d7e-account-create-update-8j226"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.060971 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk6z8\" (UniqueName: \"kubernetes.io/projected/5505a3d8-7971-4ab1-9d50-def772d62890-kube-api-access-nk6z8\") pod \"neutron-db-create-zm929\" (UID: \"5505a3d8-7971-4ab1-9d50-def772d62890\") " pod="openstack/neutron-db-create-zm929"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.060990 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5505a3d8-7971-4ab1-9d50-def772d62890-operator-scripts\") pod \"neutron-db-create-zm929\" (UID: \"5505a3d8-7971-4ab1-9d50-def772d62890\") " pod="openstack/neutron-db-create-zm929"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.061007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnm72\" (UniqueName: \"kubernetes.io/projected/700303a1-61db-4669-9dba-31bba76aa8a5-kube-api-access-lnm72\") pod \"barbican-db-create-ng46f\" (UID: \"700303a1-61db-4669-9dba-31bba76aa8a5\") " pod="openstack/barbican-db-create-ng46f"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.061059 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496f6307-7603-4bfb-8524-86fd78005b43-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.062128 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700303a1-61db-4669-9dba-31bba76aa8a5-operator-scripts\") pod \"barbican-db-create-ng46f\" (UID: \"700303a1-61db-4669-9dba-31bba76aa8a5\") " pod="openstack/barbican-db-create-ng46f"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.091014 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnm72\" (UniqueName: \"kubernetes.io/projected/700303a1-61db-4669-9dba-31bba76aa8a5-kube-api-access-lnm72\") pod \"barbican-db-create-ng46f\" (UID: \"700303a1-61db-4669-9dba-31bba76aa8a5\") " pod="openstack/barbican-db-create-ng46f"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.094495 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8d7e-account-create-update-8j226"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.115340 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1bfd-account-create-update-5nmxw"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.116864 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1bfd-account-create-update-5nmxw"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.120355 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.149568 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1bfd-account-create-update-5nmxw"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.163062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba22cc82-1acd-4b8b-b7a8-3d598262b490-operator-scripts\") pod \"neutron-8d7e-account-create-update-8j226\" (UID: \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\") " pod="openstack/neutron-8d7e-account-create-update-8j226"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.163134 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6z8\" (UniqueName: \"kubernetes.io/projected/5505a3d8-7971-4ab1-9d50-def772d62890-kube-api-access-nk6z8\") pod \"neutron-db-create-zm929\" (UID: \"5505a3d8-7971-4ab1-9d50-def772d62890\") " pod="openstack/neutron-db-create-zm929"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.163174 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5505a3d8-7971-4ab1-9d50-def772d62890-operator-scripts\") pod \"neutron-db-create-zm929\" (UID: \"5505a3d8-7971-4ab1-9d50-def772d62890\") " pod="openstack/neutron-db-create-zm929"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.163267 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlcp2\" (UniqueName: \"kubernetes.io/projected/ba22cc82-1acd-4b8b-b7a8-3d598262b490-kube-api-access-mlcp2\") pod \"neutron-8d7e-account-create-update-8j226\" (UID: \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\") " pod="openstack/neutron-8d7e-account-create-update-8j226"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.163318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9298q\" (UniqueName: \"kubernetes.io/projected/45b11c74-122b-4da8-a2e2-040fab849d4b-kube-api-access-9298q\") pod \"barbican-1bfd-account-create-update-5nmxw\" (UID: \"45b11c74-122b-4da8-a2e2-040fab849d4b\") " pod="openstack/barbican-1bfd-account-create-update-5nmxw"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.163436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45b11c74-122b-4da8-a2e2-040fab849d4b-operator-scripts\") pod \"barbican-1bfd-account-create-update-5nmxw\" (UID: \"45b11c74-122b-4da8-a2e2-040fab849d4b\") " pod="openstack/barbican-1bfd-account-create-update-5nmxw"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.165703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba22cc82-1acd-4b8b-b7a8-3d598262b490-operator-scripts\") pod \"neutron-8d7e-account-create-update-8j226\" (UID: \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\") " pod="openstack/neutron-8d7e-account-create-update-8j226"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.169030 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5505a3d8-7971-4ab1-9d50-def772d62890-operator-scripts\") pod \"neutron-db-create-zm929\" (UID: \"5505a3d8-7971-4ab1-9d50-def772d62890\") " pod="openstack/neutron-db-create-zm929"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.185294 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlcp2\" (UniqueName: \"kubernetes.io/projected/ba22cc82-1acd-4b8b-b7a8-3d598262b490-kube-api-access-mlcp2\") pod \"neutron-8d7e-account-create-update-8j226\" (UID: \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\") " pod="openstack/neutron-8d7e-account-create-update-8j226"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.192370 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ng46f"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.194003 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk6z8\" (UniqueName: \"kubernetes.io/projected/5505a3d8-7971-4ab1-9d50-def772d62890-kube-api-access-nk6z8\") pod \"neutron-db-create-zm929\" (UID: \"5505a3d8-7971-4ab1-9d50-def772d62890\") " pod="openstack/neutron-db-create-zm929"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.266068 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9298q\" (UniqueName: \"kubernetes.io/projected/45b11c74-122b-4da8-a2e2-040fab849d4b-kube-api-access-9298q\") pod \"barbican-1bfd-account-create-update-5nmxw\" (UID: \"45b11c74-122b-4da8-a2e2-040fab849d4b\") " pod="openstack/barbican-1bfd-account-create-update-5nmxw"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.266182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45b11c74-122b-4da8-a2e2-040fab849d4b-operator-scripts\") pod \"barbican-1bfd-account-create-update-5nmxw\" (UID: \"45b11c74-122b-4da8-a2e2-040fab849d4b\") " pod="openstack/barbican-1bfd-account-create-update-5nmxw"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.267151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45b11c74-122b-4da8-a2e2-040fab849d4b-operator-scripts\") pod \"barbican-1bfd-account-create-update-5nmxw\" (UID: \"45b11c74-122b-4da8-a2e2-040fab849d4b\") " pod="openstack/barbican-1bfd-account-create-update-5nmxw"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.309644 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9298q\" (UniqueName: \"kubernetes.io/projected/45b11c74-122b-4da8-a2e2-040fab849d4b-kube-api-access-9298q\") pod \"barbican-1bfd-account-create-update-5nmxw\" (UID: \"45b11c74-122b-4da8-a2e2-040fab849d4b\") " pod="openstack/barbican-1bfd-account-create-update-5nmxw"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.347413 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zm929"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.380835 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8d7e-account-create-update-8j226"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.444279 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1bfd-account-create-update-5nmxw"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.565234 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-n25pb"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.612118 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-0647-account-create-update-wpz7z"]
Mar 10 15:26:28 crc kubenswrapper[4743]: W0310 15:26:28.612999 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1124ae_ce38_4379_b981_cb509dc25ce7.slice/crio-a09d4541ff5deaa758dcc0abd38d922857e11b61e25935a2494ee907cfe35021 WatchSource:0}: Error finding container a09d4541ff5deaa758dcc0abd38d922857e11b61e25935a2494ee907cfe35021: Status 404 returned error can't find the container with id a09d4541ff5deaa758dcc0abd38d922857e11b61e25935a2494ee907cfe35021
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.648277 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2567-account-create-update-vfqwd"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.676378 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdf9x"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.724281 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-dmqvp"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.725722 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.781957 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-config\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.782055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.782231 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.782254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.782363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t89x\" (UniqueName: \"kubernetes.io/projected/219b9d1c-8e83-4f19-9163-ebb0ef8df490-kube-api-access-6t89x\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.782394 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.788258 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-dmqvp"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.855161 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-2rq5t"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.883603 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7mqd4"]
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.884106 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t89x\" (UniqueName: \"kubernetes.io/projected/219b9d1c-8e83-4f19-9163-ebb0ef8df490-kube-api-access-6t89x\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.884157 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.884205 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-config\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.884240 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.884292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.884310 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.885194 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.885641 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-config\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.885742 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.886330 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.886521 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:28 crc kubenswrapper[4743]: I0310 15:26:28.924069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t89x\" (UniqueName: \"kubernetes.io/projected/219b9d1c-8e83-4f19-9163-ebb0ef8df490-kube-api-access-6t89x\") pod \"dnsmasq-dns-5f59b8f679-dmqvp\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.077806 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2567-account-create-update-vfqwd" event={"ID":"82c6da81-8a78-4383-aa08-580c5242a582","Type":"ContainerStarted","Data":"45b35d14969bcda7fd3dbd2f31e97ad980817a2878f201f350ea7fe267531e7b"}
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.085684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-0647-account-create-update-wpz7z" event={"ID":"43962c3a-61ca-4e7c-b56c-d9eefcaddeee","Type":"ContainerStarted","Data":"ad18c91e8ae7676fb1c5f1232e558f823716fb086530ea7a5543c3f7309823c8"}
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.088380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7mqd4" event={"ID":"49e25cc1-8829-47fa-8a68-9968a7ba8e75","Type":"ContainerStarted","Data":"daf3ee9ebf9f262fd392c04d78729c08f885d5eb8ba90f725afa3a0abba09c62"}
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.092308 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp"
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.093364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-n25pb" event={"ID":"2f1124ae-ce38-4379-b981-cb509dc25ce7","Type":"ContainerStarted","Data":"a09d4541ff5deaa758dcc0abd38d922857e11b61e25935a2494ee907cfe35021"}
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.095785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-2rq5t" event={"ID":"a6fe554d-63d8-4ba6-948b-d8658db57faa","Type":"ContainerStarted","Data":"bf0bb7f4688dccd19c9c72930f322ee8ae379174bcb1821920214f8d9710da26"}
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.136377 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ng46f"]
Mar 10 15:26:29 crc kubenswrapper[4743]: W0310 15:26:29.179395 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod700303a1_61db_4669_9dba_31bba76aa8a5.slice/crio-0f86c539cfbab69ab08ff140606dc8477310578b96cbb1eb9d422a101764054a WatchSource:0}: Error finding container 0f86c539cfbab69ab08ff140606dc8477310578b96cbb1eb9d422a101764054a: Status 404 returned error can't find the container with id 0f86c539cfbab69ab08ff140606dc8477310578b96cbb1eb9d422a101764054a
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.275231 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1bfd-account-create-update-5nmxw"]
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.389463 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zm929"]
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.412026 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8d7e-account-create-update-8j226"]
Mar 10 15:26:29 crc kubenswrapper[4743]: W0310 15:26:29.423234 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba22cc82_1acd_4b8b_b7a8_3d598262b490.slice/crio-c7bef9048907cf22fea72cb603ca4f5185f102933023cafa68fce7ad1177fd3f WatchSource:0}: Error finding container c7bef9048907cf22fea72cb603ca4f5185f102933023cafa68fce7ad1177fd3f: Status 404 returned error can't find the container with id c7bef9048907cf22fea72cb603ca4f5185f102933023cafa68fce7ad1177fd3f
Mar 10 15:26:29 crc kubenswrapper[4743]: I0310 15:26:29.713850 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-dmqvp"]
Mar 10 15:26:29 crc kubenswrapper[4743]: W0310 15:26:29.913608 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod219b9d1c_8e83_4f19_9163_ebb0ef8df490.slice/crio-14f8d54ad686e3d155d897471f2a8d5a879196a663a8b9c789ea102c7dc37e8f WatchSource:0}: Error finding container 14f8d54ad686e3d155d897471f2a8d5a879196a663a8b9c789ea102c7dc37e8f: Status 404 returned error can't find the container with id 14f8d54ad686e3d155d897471f2a8d5a879196a663a8b9c789ea102c7dc37e8f
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.108108 4743 generic.go:334] "Generic (PLEG): container finished" podID="5505a3d8-7971-4ab1-9d50-def772d62890" containerID="c0bc65b83909e55b7c6036d7051811bb7a0bc331c0a7957b851b97054df50db3" exitCode=0
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.108189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zm929" event={"ID":"5505a3d8-7971-4ab1-9d50-def772d62890","Type":"ContainerDied","Data":"c0bc65b83909e55b7c6036d7051811bb7a0bc331c0a7957b851b97054df50db3"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.108391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zm929" event={"ID":"5505a3d8-7971-4ab1-9d50-def772d62890","Type":"ContainerStarted","Data":"404ddf39cab942af14916429103804fe93d0db79076d26e7ef1108e2280834ef"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.110088 4743 generic.go:334] "Generic (PLEG): container finished" podID="82c6da81-8a78-4383-aa08-580c5242a582" containerID="978071ec4fd79d6c2c9c0233a5371cf76049c7f9464452f18d9c6f3ffb2be1df" exitCode=0
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.110156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2567-account-create-update-vfqwd" event={"ID":"82c6da81-8a78-4383-aa08-580c5242a582","Type":"ContainerDied","Data":"978071ec4fd79d6c2c9c0233a5371cf76049c7f9464452f18d9c6f3ffb2be1df"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.111961 4743 generic.go:334] "Generic (PLEG): container finished" podID="43962c3a-61ca-4e7c-b56c-d9eefcaddeee" containerID="a2db8459c8d99f4e1accffe0b6e2e05db562910b5566931f52012c1f77126af5" exitCode=0
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.112017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-0647-account-create-update-wpz7z" event={"ID":"43962c3a-61ca-4e7c-b56c-d9eefcaddeee","Type":"ContainerDied","Data":"a2db8459c8d99f4e1accffe0b6e2e05db562910b5566931f52012c1f77126af5"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.114805 4743 generic.go:334] "Generic (PLEG): container finished" podID="2f1124ae-ce38-4379-b981-cb509dc25ce7" containerID="300da71cb91d0ac1a2302b90060f8f7de6cafc2774f32758184775b8fe3a4dd0" exitCode=0
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.114922 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-n25pb" event={"ID":"2f1124ae-ce38-4379-b981-cb509dc25ce7","Type":"ContainerDied","Data":"300da71cb91d0ac1a2302b90060f8f7de6cafc2774f32758184775b8fe3a4dd0"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.116637 4743 generic.go:334] "Generic (PLEG): container finished" podID="a6fe554d-63d8-4ba6-948b-d8658db57faa" containerID="237cfa3f4796c9a423cf4553297b98768d36c0552b15995398fff3d8f1578a1f" exitCode=0
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.116698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-2rq5t" event={"ID":"a6fe554d-63d8-4ba6-948b-d8658db57faa","Type":"ContainerDied","Data":"237cfa3f4796c9a423cf4553297b98768d36c0552b15995398fff3d8f1578a1f"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.124375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" event={"ID":"219b9d1c-8e83-4f19-9163-ebb0ef8df490","Type":"ContainerStarted","Data":"14f8d54ad686e3d155d897471f2a8d5a879196a663a8b9c789ea102c7dc37e8f"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.133372 4743 generic.go:334] "Generic (PLEG): container finished" podID="700303a1-61db-4669-9dba-31bba76aa8a5" containerID="84a396c96dc27e777c7f9314a1540f6da393210e011f6a9974e57b005fcaa3a5" exitCode=0
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.133416 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ng46f" event={"ID":"700303a1-61db-4669-9dba-31bba76aa8a5","Type":"ContainerDied","Data":"84a396c96dc27e777c7f9314a1540f6da393210e011f6a9974e57b005fcaa3a5"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.133473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ng46f" event={"ID":"700303a1-61db-4669-9dba-31bba76aa8a5","Type":"ContainerStarted","Data":"0f86c539cfbab69ab08ff140606dc8477310578b96cbb1eb9d422a101764054a"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.135792 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba22cc82-1acd-4b8b-b7a8-3d598262b490" containerID="069cd101a1566d81e82a72fbc921a8de11938c3ea8594382fdfeac009bc71493" exitCode=0
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.135869 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d7e-account-create-update-8j226" event={"ID":"ba22cc82-1acd-4b8b-b7a8-3d598262b490","Type":"ContainerDied","Data":"069cd101a1566d81e82a72fbc921a8de11938c3ea8594382fdfeac009bc71493"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.135899 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d7e-account-create-update-8j226" event={"ID":"ba22cc82-1acd-4b8b-b7a8-3d598262b490","Type":"ContainerStarted","Data":"c7bef9048907cf22fea72cb603ca4f5185f102933023cafa68fce7ad1177fd3f"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.145721 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" podUID="2b12b745-718e-4ee6-9751-ba679e3e274f" containerName="dnsmasq-dns" containerID="cri-o://f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce" gracePeriod=10
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.146961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1bfd-account-create-update-5nmxw" event={"ID":"45b11c74-122b-4da8-a2e2-040fab849d4b","Type":"ContainerStarted","Data":"8c8f0d92e486410a9d69d1c785aeafdd6070e8e2f2f256afd68a7e43cc265b33"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.146994 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1bfd-account-create-update-5nmxw" event={"ID":"45b11c74-122b-4da8-a2e2-040fab849d4b","Type":"ContainerStarted","Data":"2b3a7dd622e3648c81cb3596e4afa8cd3c8fd640e00f56bd56c1545de64cc9ae"}
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.418637 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-1bfd-account-create-update-5nmxw" podStartSLOduration=3.41861268 podStartE2EDuration="3.41861268s" podCreationTimestamp="2026-03-10 15:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:30.409135471 +0000 UTC m=+1255.115950219" watchObservedRunningTime="2026-03-10 15:26:30.41861268 +0000 UTC m=+1255.125427428"
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.811077 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x"
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.941508 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-svc\") pod \"2b12b745-718e-4ee6-9751-ba679e3e274f\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") "
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.941619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2lvw\" (UniqueName: \"kubernetes.io/projected/2b12b745-718e-4ee6-9751-ba679e3e274f-kube-api-access-s2lvw\") pod \"2b12b745-718e-4ee6-9751-ba679e3e274f\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") "
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.941723 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-sb\") pod \"2b12b745-718e-4ee6-9751-ba679e3e274f\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") "
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.941826 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-nb\") pod \"2b12b745-718e-4ee6-9751-ba679e3e274f\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") "
Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.941970 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-config\") pod \"2b12b745-718e-4ee6-9751-ba679e3e274f\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.942033 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-swift-storage-0\") pod \"2b12b745-718e-4ee6-9751-ba679e3e274f\" (UID: \"2b12b745-718e-4ee6-9751-ba679e3e274f\") " Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.949889 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b12b745-718e-4ee6-9751-ba679e3e274f-kube-api-access-s2lvw" (OuterVolumeSpecName: "kube-api-access-s2lvw") pod "2b12b745-718e-4ee6-9751-ba679e3e274f" (UID: "2b12b745-718e-4ee6-9751-ba679e3e274f"). InnerVolumeSpecName "kube-api-access-s2lvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.984263 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2b12b745-718e-4ee6-9751-ba679e3e274f" (UID: "2b12b745-718e-4ee6-9751-ba679e3e274f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.986237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b12b745-718e-4ee6-9751-ba679e3e274f" (UID: "2b12b745-718e-4ee6-9751-ba679e3e274f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:30 crc kubenswrapper[4743]: I0310 15:26:30.995365 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b12b745-718e-4ee6-9751-ba679e3e274f" (UID: "2b12b745-718e-4ee6-9751-ba679e3e274f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.000927 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-config" (OuterVolumeSpecName: "config") pod "2b12b745-718e-4ee6-9751-ba679e3e274f" (UID: "2b12b745-718e-4ee6-9751-ba679e3e274f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.004743 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b12b745-718e-4ee6-9751-ba679e3e274f" (UID: "2b12b745-718e-4ee6-9751-ba679e3e274f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.045177 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.045222 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.045236 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.045248 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.045258 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2lvw\" (UniqueName: \"kubernetes.io/projected/2b12b745-718e-4ee6-9751-ba679e3e274f-kube-api-access-s2lvw\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.045272 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b12b745-718e-4ee6-9751-ba679e3e274f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.157666 4743 generic.go:334] "Generic (PLEG): container finished" podID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerID="982511c686a9bee8ccd48256adb2e7a9032ce7e3d71f305e642c126cc1bacb74" exitCode=0 Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.158089 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" event={"ID":"219b9d1c-8e83-4f19-9163-ebb0ef8df490","Type":"ContainerDied","Data":"982511c686a9bee8ccd48256adb2e7a9032ce7e3d71f305e642c126cc1bacb74"} Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.160924 4743 generic.go:334] "Generic (PLEG): container finished" podID="45b11c74-122b-4da8-a2e2-040fab849d4b" containerID="8c8f0d92e486410a9d69d1c785aeafdd6070e8e2f2f256afd68a7e43cc265b33" exitCode=0 Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.161014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1bfd-account-create-update-5nmxw" event={"ID":"45b11c74-122b-4da8-a2e2-040fab849d4b","Type":"ContainerDied","Data":"8c8f0d92e486410a9d69d1c785aeafdd6070e8e2f2f256afd68a7e43cc265b33"} Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.164987 4743 generic.go:334] "Generic (PLEG): container finished" podID="2b12b745-718e-4ee6-9751-ba679e3e274f" containerID="f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce" exitCode=0 Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.165291 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.168425 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" event={"ID":"2b12b745-718e-4ee6-9751-ba679e3e274f","Type":"ContainerDied","Data":"f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce"} Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.168479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdf9x" event={"ID":"2b12b745-718e-4ee6-9751-ba679e3e274f","Type":"ContainerDied","Data":"8ec76bd88a3a4af845045370828acf97b1f34441ffb894d8717d4dd099e32e6c"} Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.168499 4743 scope.go:117] "RemoveContainer" containerID="f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.235481 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdf9x"] Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.238647 4743 scope.go:117] "RemoveContainer" containerID="8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.251013 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdf9x"] Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.326322 4743 scope.go:117] "RemoveContainer" containerID="f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce" Mar 10 15:26:31 crc kubenswrapper[4743]: E0310 15:26:31.331218 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce\": container with ID starting with f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce not found: ID does not exist" 
containerID="f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.331258 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce"} err="failed to get container status \"f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce\": rpc error: code = NotFound desc = could not find container \"f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce\": container with ID starting with f82078ce6f2b298e59e5def4eb2a99395585b767dda903a1acf192252e5124ce not found: ID does not exist" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.331286 4743 scope.go:117] "RemoveContainer" containerID="8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856" Mar 10 15:26:31 crc kubenswrapper[4743]: E0310 15:26:31.335874 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856\": container with ID starting with 8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856 not found: ID does not exist" containerID="8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.335912 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856"} err="failed to get container status \"8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856\": rpc error: code = NotFound desc = could not find container \"8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856\": container with ID starting with 8219a58c1737b265cf648a35851280fa1762ffd0ada6755b064bb61db3ef7856 not found: ID does not exist" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.652853 4743 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.794418 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-operator-scripts\") pod \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\" (UID: \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.794475 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5kmj\" (UniqueName: \"kubernetes.io/projected/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-kube-api-access-j5kmj\") pod \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\" (UID: \"43962c3a-61ca-4e7c-b56c-d9eefcaddeee\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.796134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43962c3a-61ca-4e7c-b56c-d9eefcaddeee" (UID: "43962c3a-61ca-4e7c-b56c-d9eefcaddeee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.808184 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-kube-api-access-j5kmj" (OuterVolumeSpecName: "kube-api-access-j5kmj") pod "43962c3a-61ca-4e7c-b56c-d9eefcaddeee" (UID: "43962c3a-61ca-4e7c-b56c-d9eefcaddeee"). InnerVolumeSpecName "kube-api-access-j5kmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.824196 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ng46f" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.836801 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zm929" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.856942 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.889314 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8d7e-account-create-update-8j226" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.896986 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.897024 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5kmj\" (UniqueName: \"kubernetes.io/projected/43962c3a-61ca-4e7c-b56c-d9eefcaddeee-kube-api-access-j5kmj\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.898329 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.924661 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.946592 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b12b745-718e-4ee6-9751-ba679e3e274f" path="/var/lib/kubelet/pods/2b12b745-718e-4ee6-9751-ba679e3e274f/volumes" Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998157 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c6da81-8a78-4383-aa08-580c5242a582-operator-scripts\") pod \"82c6da81-8a78-4383-aa08-580c5242a582\" (UID: \"82c6da81-8a78-4383-aa08-580c5242a582\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68cwh\" (UniqueName: \"kubernetes.io/projected/2f1124ae-ce38-4379-b981-cb509dc25ce7-kube-api-access-68cwh\") pod \"2f1124ae-ce38-4379-b981-cb509dc25ce7\" (UID: \"2f1124ae-ce38-4379-b981-cb509dc25ce7\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998309 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlcp2\" (UniqueName: \"kubernetes.io/projected/ba22cc82-1acd-4b8b-b7a8-3d598262b490-kube-api-access-mlcp2\") pod \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\" (UID: \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998375 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnm72\" (UniqueName: \"kubernetes.io/projected/700303a1-61db-4669-9dba-31bba76aa8a5-kube-api-access-lnm72\") pod \"700303a1-61db-4669-9dba-31bba76aa8a5\" (UID: \"700303a1-61db-4669-9dba-31bba76aa8a5\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998402 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfb57\" (UniqueName: 
\"kubernetes.io/projected/82c6da81-8a78-4383-aa08-580c5242a582-kube-api-access-tfb57\") pod \"82c6da81-8a78-4383-aa08-580c5242a582\" (UID: \"82c6da81-8a78-4383-aa08-580c5242a582\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998450 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5505a3d8-7971-4ab1-9d50-def772d62890-operator-scripts\") pod \"5505a3d8-7971-4ab1-9d50-def772d62890\" (UID: \"5505a3d8-7971-4ab1-9d50-def772d62890\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998531 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba22cc82-1acd-4b8b-b7a8-3d598262b490-operator-scripts\") pod \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\" (UID: \"ba22cc82-1acd-4b8b-b7a8-3d598262b490\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1124ae-ce38-4379-b981-cb509dc25ce7-operator-scripts\") pod \"2f1124ae-ce38-4379-b981-cb509dc25ce7\" (UID: \"2f1124ae-ce38-4379-b981-cb509dc25ce7\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700303a1-61db-4669-9dba-31bba76aa8a5-operator-scripts\") pod \"700303a1-61db-4669-9dba-31bba76aa8a5\" (UID: \"700303a1-61db-4669-9dba-31bba76aa8a5\") " Mar 10 15:26:31 crc kubenswrapper[4743]: I0310 15:26:31.998734 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk6z8\" (UniqueName: \"kubernetes.io/projected/5505a3d8-7971-4ab1-9d50-def772d62890-kube-api-access-nk6z8\") pod \"5505a3d8-7971-4ab1-9d50-def772d62890\" (UID: \"5505a3d8-7971-4ab1-9d50-def772d62890\") " Mar 10 15:26:32 
crc kubenswrapper[4743]: I0310 15:26:32.001527 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5505a3d8-7971-4ab1-9d50-def772d62890-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5505a3d8-7971-4ab1-9d50-def772d62890" (UID: "5505a3d8-7971-4ab1-9d50-def772d62890"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.002934 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba22cc82-1acd-4b8b-b7a8-3d598262b490-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba22cc82-1acd-4b8b-b7a8-3d598262b490" (UID: "ba22cc82-1acd-4b8b-b7a8-3d598262b490"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.005723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba22cc82-1acd-4b8b-b7a8-3d598262b490-kube-api-access-mlcp2" (OuterVolumeSpecName: "kube-api-access-mlcp2") pod "ba22cc82-1acd-4b8b-b7a8-3d598262b490" (UID: "ba22cc82-1acd-4b8b-b7a8-3d598262b490"). InnerVolumeSpecName "kube-api-access-mlcp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.006138 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c6da81-8a78-4383-aa08-580c5242a582-kube-api-access-tfb57" (OuterVolumeSpecName: "kube-api-access-tfb57") pod "82c6da81-8a78-4383-aa08-580c5242a582" (UID: "82c6da81-8a78-4383-aa08-580c5242a582"). InnerVolumeSpecName "kube-api-access-tfb57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.006463 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82c6da81-8a78-4383-aa08-580c5242a582-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82c6da81-8a78-4383-aa08-580c5242a582" (UID: "82c6da81-8a78-4383-aa08-580c5242a582"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.007126 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/700303a1-61db-4669-9dba-31bba76aa8a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "700303a1-61db-4669-9dba-31bba76aa8a5" (UID: "700303a1-61db-4669-9dba-31bba76aa8a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.007560 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1124ae-ce38-4379-b981-cb509dc25ce7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f1124ae-ce38-4379-b981-cb509dc25ce7" (UID: "2f1124ae-ce38-4379-b981-cb509dc25ce7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.009443 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1124ae-ce38-4379-b981-cb509dc25ce7-kube-api-access-68cwh" (OuterVolumeSpecName: "kube-api-access-68cwh") pod "2f1124ae-ce38-4379-b981-cb509dc25ce7" (UID: "2f1124ae-ce38-4379-b981-cb509dc25ce7"). InnerVolumeSpecName "kube-api-access-68cwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.010411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700303a1-61db-4669-9dba-31bba76aa8a5-kube-api-access-lnm72" (OuterVolumeSpecName: "kube-api-access-lnm72") pod "700303a1-61db-4669-9dba-31bba76aa8a5" (UID: "700303a1-61db-4669-9dba-31bba76aa8a5"). InnerVolumeSpecName "kube-api-access-lnm72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.010695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5505a3d8-7971-4ab1-9d50-def772d62890-kube-api-access-nk6z8" (OuterVolumeSpecName: "kube-api-access-nk6z8") pod "5505a3d8-7971-4ab1-9d50-def772d62890" (UID: "5505a3d8-7971-4ab1-9d50-def772d62890"). InnerVolumeSpecName "kube-api-access-nk6z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.100211 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfckk\" (UniqueName: \"kubernetes.io/projected/a6fe554d-63d8-4ba6-948b-d8658db57faa-kube-api-access-wfckk\") pod \"a6fe554d-63d8-4ba6-948b-d8658db57faa\" (UID: \"a6fe554d-63d8-4ba6-948b-d8658db57faa\") " Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.100254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fe554d-63d8-4ba6-948b-d8658db57faa-operator-scripts\") pod \"a6fe554d-63d8-4ba6-948b-d8658db57faa\" (UID: \"a6fe554d-63d8-4ba6-948b-d8658db57faa\") " Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101278 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c6da81-8a78-4383-aa08-580c5242a582-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc 
kubenswrapper[4743]: I0310 15:26:32.101302 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68cwh\" (UniqueName: \"kubernetes.io/projected/2f1124ae-ce38-4379-b981-cb509dc25ce7-kube-api-access-68cwh\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101313 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlcp2\" (UniqueName: \"kubernetes.io/projected/ba22cc82-1acd-4b8b-b7a8-3d598262b490-kube-api-access-mlcp2\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101323 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnm72\" (UniqueName: \"kubernetes.io/projected/700303a1-61db-4669-9dba-31bba76aa8a5-kube-api-access-lnm72\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101331 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfb57\" (UniqueName: \"kubernetes.io/projected/82c6da81-8a78-4383-aa08-580c5242a582-kube-api-access-tfb57\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101340 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5505a3d8-7971-4ab1-9d50-def772d62890-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101349 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba22cc82-1acd-4b8b-b7a8-3d598262b490-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101357 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1124ae-ce38-4379-b981-cb509dc25ce7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101365 4743 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700303a1-61db-4669-9dba-31bba76aa8a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.101373 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk6z8\" (UniqueName: \"kubernetes.io/projected/5505a3d8-7971-4ab1-9d50-def772d62890-kube-api-access-nk6z8\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.104264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fe554d-63d8-4ba6-948b-d8658db57faa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6fe554d-63d8-4ba6-948b-d8658db57faa" (UID: "a6fe554d-63d8-4ba6-948b-d8658db57faa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.106455 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6fe554d-63d8-4ba6-948b-d8658db57faa-kube-api-access-wfckk" (OuterVolumeSpecName: "kube-api-access-wfckk") pod "a6fe554d-63d8-4ba6-948b-d8658db57faa" (UID: "a6fe554d-63d8-4ba6-948b-d8658db57faa"). InnerVolumeSpecName "kube-api-access-wfckk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.178355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zm929" event={"ID":"5505a3d8-7971-4ab1-9d50-def772d62890","Type":"ContainerDied","Data":"404ddf39cab942af14916429103804fe93d0db79076d26e7ef1108e2280834ef"} Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.179287 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="404ddf39cab942af14916429103804fe93d0db79076d26e7ef1108e2280834ef" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.179404 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zm929" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.189112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2567-account-create-update-vfqwd" event={"ID":"82c6da81-8a78-4383-aa08-580c5242a582","Type":"ContainerDied","Data":"45b35d14969bcda7fd3dbd2f31e97ad980817a2878f201f350ea7fe267531e7b"} Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.189154 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45b35d14969bcda7fd3dbd2f31e97ad980817a2878f201f350ea7fe267531e7b" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.189170 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2567-account-create-update-vfqwd" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.190645 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ng46f" event={"ID":"700303a1-61db-4669-9dba-31bba76aa8a5","Type":"ContainerDied","Data":"0f86c539cfbab69ab08ff140606dc8477310578b96cbb1eb9d422a101764054a"} Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.190666 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f86c539cfbab69ab08ff140606dc8477310578b96cbb1eb9d422a101764054a" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.190722 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ng46f" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.204233 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfckk\" (UniqueName: \"kubernetes.io/projected/a6fe554d-63d8-4ba6-948b-d8658db57faa-kube-api-access-wfckk\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.205444 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fe554d-63d8-4ba6-948b-d8658db57faa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.223320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d7e-account-create-update-8j226" event={"ID":"ba22cc82-1acd-4b8b-b7a8-3d598262b490","Type":"ContainerDied","Data":"c7bef9048907cf22fea72cb603ca4f5185f102933023cafa68fce7ad1177fd3f"} Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.223385 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7bef9048907cf22fea72cb603ca4f5185f102933023cafa68fce7ad1177fd3f" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.223336 4743 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/neutron-8d7e-account-create-update-8j226" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.228248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-n25pb" event={"ID":"2f1124ae-ce38-4379-b981-cb509dc25ce7","Type":"ContainerDied","Data":"a09d4541ff5deaa758dcc0abd38d922857e11b61e25935a2494ee907cfe35021"} Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.228291 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09d4541ff5deaa758dcc0abd38d922857e11b61e25935a2494ee907cfe35021" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.228295 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-n25pb" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.229893 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-2rq5t" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.230917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-2rq5t" event={"ID":"a6fe554d-63d8-4ba6-948b-d8658db57faa","Type":"ContainerDied","Data":"bf0bb7f4688dccd19c9c72930f322ee8ae379174bcb1821920214f8d9710da26"} Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.230945 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf0bb7f4688dccd19c9c72930f322ee8ae379174bcb1821920214f8d9710da26" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.232998 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" event={"ID":"219b9d1c-8e83-4f19-9163-ebb0ef8df490","Type":"ContainerStarted","Data":"c0bb05b0860d33025afaa2609668f2d6dbd274f9a1c8c8e0727fd082c1e893d4"} Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.233434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" 
Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.235226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-0647-account-create-update-wpz7z" event={"ID":"43962c3a-61ca-4e7c-b56c-d9eefcaddeee","Type":"ContainerDied","Data":"ad18c91e8ae7676fb1c5f1232e558f823716fb086530ea7a5543c3f7309823c8"} Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.235266 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad18c91e8ae7676fb1c5f1232e558f823716fb086530ea7a5543c3f7309823c8" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.235353 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-0647-account-create-update-wpz7z" Mar 10 15:26:32 crc kubenswrapper[4743]: I0310 15:26:32.264350 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" podStartSLOduration=4.264328774 podStartE2EDuration="4.264328774s" podCreationTimestamp="2026-03-10 15:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:32.260338851 +0000 UTC m=+1256.967153609" watchObservedRunningTime="2026-03-10 15:26:32.264328774 +0000 UTC m=+1256.971143522" Mar 10 15:26:34 crc kubenswrapper[4743]: I0310 15:26:34.901284 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1bfd-account-create-update-5nmxw" Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.076984 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45b11c74-122b-4da8-a2e2-040fab849d4b-operator-scripts\") pod \"45b11c74-122b-4da8-a2e2-040fab849d4b\" (UID: \"45b11c74-122b-4da8-a2e2-040fab849d4b\") " Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.077059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9298q\" (UniqueName: \"kubernetes.io/projected/45b11c74-122b-4da8-a2e2-040fab849d4b-kube-api-access-9298q\") pod \"45b11c74-122b-4da8-a2e2-040fab849d4b\" (UID: \"45b11c74-122b-4da8-a2e2-040fab849d4b\") " Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.078289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b11c74-122b-4da8-a2e2-040fab849d4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45b11c74-122b-4da8-a2e2-040fab849d4b" (UID: "45b11c74-122b-4da8-a2e2-040fab849d4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.081144 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b11c74-122b-4da8-a2e2-040fab849d4b-kube-api-access-9298q" (OuterVolumeSpecName: "kube-api-access-9298q") pod "45b11c74-122b-4da8-a2e2-040fab849d4b" (UID: "45b11c74-122b-4da8-a2e2-040fab849d4b"). InnerVolumeSpecName "kube-api-access-9298q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.179906 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45b11c74-122b-4da8-a2e2-040fab849d4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.179935 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9298q\" (UniqueName: \"kubernetes.io/projected/45b11c74-122b-4da8-a2e2-040fab849d4b-kube-api-access-9298q\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.260733 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7mqd4" event={"ID":"49e25cc1-8829-47fa-8a68-9968a7ba8e75","Type":"ContainerStarted","Data":"cda9dd23595e75df463f0b79861831f21628bf5b35e2fa35e8042fb3d2251ac7"} Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.262749 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1bfd-account-create-update-5nmxw" event={"ID":"45b11c74-122b-4da8-a2e2-040fab849d4b","Type":"ContainerDied","Data":"2b3a7dd622e3648c81cb3596e4afa8cd3c8fd640e00f56bd56c1545de64cc9ae"} Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.262785 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b3a7dd622e3648c81cb3596e4afa8cd3c8fd640e00f56bd56c1545de64cc9ae" Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.262850 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1bfd-account-create-update-5nmxw" Mar 10 15:26:35 crc kubenswrapper[4743]: I0310 15:26:35.958588 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7mqd4" podStartSLOduration=3.002861705 podStartE2EDuration="8.958565042s" podCreationTimestamp="2026-03-10 15:26:27 +0000 UTC" firstStartedPulling="2026-03-10 15:26:28.938652214 +0000 UTC m=+1253.645466962" lastFinishedPulling="2026-03-10 15:26:34.894355551 +0000 UTC m=+1259.601170299" observedRunningTime="2026-03-10 15:26:35.293944849 +0000 UTC m=+1260.000759647" watchObservedRunningTime="2026-03-10 15:26:35.958565042 +0000 UTC m=+1260.665379800" Mar 10 15:26:38 crc kubenswrapper[4743]: I0310 15:26:38.286564 4743 generic.go:334] "Generic (PLEG): container finished" podID="49e25cc1-8829-47fa-8a68-9968a7ba8e75" containerID="cda9dd23595e75df463f0b79861831f21628bf5b35e2fa35e8042fb3d2251ac7" exitCode=0 Mar 10 15:26:38 crc kubenswrapper[4743]: I0310 15:26:38.286714 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7mqd4" event={"ID":"49e25cc1-8829-47fa-8a68-9968a7ba8e75","Type":"ContainerDied","Data":"cda9dd23595e75df463f0b79861831f21628bf5b35e2fa35e8042fb3d2251ac7"} Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.093986 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.153249 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5js7q"] Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.153482 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" podUID="016d978b-7540-425c-8328-75d43cf9f042" containerName="dnsmasq-dns" containerID="cri-o://4190571d4cd4869dcbfd420bfa52f9a27dc08138e66d9e3a4ad68baa16ae9bfe" gracePeriod=10 Mar 10 15:26:39 crc 
kubenswrapper[4743]: I0310 15:26:39.299463 4743 generic.go:334] "Generic (PLEG): container finished" podID="016d978b-7540-425c-8328-75d43cf9f042" containerID="4190571d4cd4869dcbfd420bfa52f9a27dc08138e66d9e3a4ad68baa16ae9bfe" exitCode=0 Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.299971 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" event={"ID":"016d978b-7540-425c-8328-75d43cf9f042","Type":"ContainerDied","Data":"4190571d4cd4869dcbfd420bfa52f9a27dc08138e66d9e3a4ad68baa16ae9bfe"} Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.813828 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.820156 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.965508 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-nb\") pod \"016d978b-7540-425c-8328-75d43cf9f042\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.965620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-sb\") pod \"016d978b-7540-425c-8328-75d43cf9f042\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.966383 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-config-data\") pod \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " Mar 10 15:26:39 crc 
kubenswrapper[4743]: I0310 15:26:39.966722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-dns-svc\") pod \"016d978b-7540-425c-8328-75d43cf9f042\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.966782 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-config\") pod \"016d978b-7540-425c-8328-75d43cf9f042\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.966808 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-combined-ca-bundle\") pod \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.966856 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x59c\" (UniqueName: \"kubernetes.io/projected/016d978b-7540-425c-8328-75d43cf9f042-kube-api-access-6x59c\") pod \"016d978b-7540-425c-8328-75d43cf9f042\" (UID: \"016d978b-7540-425c-8328-75d43cf9f042\") " Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.966889 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dm4h\" (UniqueName: \"kubernetes.io/projected/49e25cc1-8829-47fa-8a68-9968a7ba8e75-kube-api-access-5dm4h\") pod \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\" (UID: \"49e25cc1-8829-47fa-8a68-9968a7ba8e75\") " Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.971025 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e25cc1-8829-47fa-8a68-9968a7ba8e75-kube-api-access-5dm4h" (OuterVolumeSpecName: 
"kube-api-access-5dm4h") pod "49e25cc1-8829-47fa-8a68-9968a7ba8e75" (UID: "49e25cc1-8829-47fa-8a68-9968a7ba8e75"). InnerVolumeSpecName "kube-api-access-5dm4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:39 crc kubenswrapper[4743]: I0310 15:26:39.974171 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016d978b-7540-425c-8328-75d43cf9f042-kube-api-access-6x59c" (OuterVolumeSpecName: "kube-api-access-6x59c") pod "016d978b-7540-425c-8328-75d43cf9f042" (UID: "016d978b-7540-425c-8328-75d43cf9f042"). InnerVolumeSpecName "kube-api-access-6x59c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.001990 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49e25cc1-8829-47fa-8a68-9968a7ba8e75" (UID: "49e25cc1-8829-47fa-8a68-9968a7ba8e75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.023429 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-config" (OuterVolumeSpecName: "config") pod "016d978b-7540-425c-8328-75d43cf9f042" (UID: "016d978b-7540-425c-8328-75d43cf9f042"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.030189 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-config-data" (OuterVolumeSpecName: "config-data") pod "49e25cc1-8829-47fa-8a68-9968a7ba8e75" (UID: "49e25cc1-8829-47fa-8a68-9968a7ba8e75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.031421 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "016d978b-7540-425c-8328-75d43cf9f042" (UID: "016d978b-7540-425c-8328-75d43cf9f042"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.040629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "016d978b-7540-425c-8328-75d43cf9f042" (UID: "016d978b-7540-425c-8328-75d43cf9f042"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.053083 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "016d978b-7540-425c-8328-75d43cf9f042" (UID: "016d978b-7540-425c-8328-75d43cf9f042"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.068759 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.068804 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.068834 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x59c\" (UniqueName: \"kubernetes.io/projected/016d978b-7540-425c-8328-75d43cf9f042-kube-api-access-6x59c\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.068845 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dm4h\" (UniqueName: \"kubernetes.io/projected/49e25cc1-8829-47fa-8a68-9968a7ba8e75-kube-api-access-5dm4h\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.068853 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.068862 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.068871 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e25cc1-8829-47fa-8a68-9968a7ba8e75-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 
15:26:40.068880 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016d978b-7540-425c-8328-75d43cf9f042-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.311611 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" event={"ID":"016d978b-7540-425c-8328-75d43cf9f042","Type":"ContainerDied","Data":"ebccc5eb5d3459e1341c23c5f9ff3ee45c2f85eb7d5275702d70ea7e26210e46"} Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.311671 4743 scope.go:117] "RemoveContainer" containerID="4190571d4cd4869dcbfd420bfa52f9a27dc08138e66d9e3a4ad68baa16ae9bfe" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.311672 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-5js7q" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.315275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7mqd4" event={"ID":"49e25cc1-8829-47fa-8a68-9968a7ba8e75","Type":"ContainerDied","Data":"daf3ee9ebf9f262fd392c04d78729c08f885d5eb8ba90f725afa3a0abba09c62"} Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.315333 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daf3ee9ebf9f262fd392c04d78729c08f885d5eb8ba90f725afa3a0abba09c62" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.315356 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7mqd4" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.360143 4743 scope.go:117] "RemoveContainer" containerID="54a0e0020ddd816943bdd70bab2e6d0987c76359b2a5a1fca94a254609a4b372" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.368550 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5js7q"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.378510 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-5js7q"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.609876 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dmjpg"] Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610287 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700303a1-61db-4669-9dba-31bba76aa8a5" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610305 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="700303a1-61db-4669-9dba-31bba76aa8a5" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610319 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1124ae-ce38-4379-b981-cb509dc25ce7" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610326 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1124ae-ce38-4379-b981-cb509dc25ce7" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610336 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e25cc1-8829-47fa-8a68-9968a7ba8e75" containerName="keystone-db-sync" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610345 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e25cc1-8829-47fa-8a68-9968a7ba8e75" containerName="keystone-db-sync" Mar 10 15:26:40 crc kubenswrapper[4743]: 
E0310 15:26:40.610356 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5505a3d8-7971-4ab1-9d50-def772d62890" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610362 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5505a3d8-7971-4ab1-9d50-def772d62890" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610372 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016d978b-7540-425c-8328-75d43cf9f042" containerName="dnsmasq-dns" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610378 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="016d978b-7540-425c-8328-75d43cf9f042" containerName="dnsmasq-dns" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610388 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fe554d-63d8-4ba6-948b-d8658db57faa" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610396 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fe554d-63d8-4ba6-948b-d8658db57faa" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610406 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12b745-718e-4ee6-9751-ba679e3e274f" containerName="init" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610412 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12b745-718e-4ee6-9751-ba679e3e274f" containerName="init" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610424 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b11c74-122b-4da8-a2e2-040fab849d4b" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610430 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b11c74-122b-4da8-a2e2-040fab849d4b" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc 
kubenswrapper[4743]: E0310 15:26:40.610438 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c6da81-8a78-4383-aa08-580c5242a582" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610443 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c6da81-8a78-4383-aa08-580c5242a582" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610454 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016d978b-7540-425c-8328-75d43cf9f042" containerName="init" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610460 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="016d978b-7540-425c-8328-75d43cf9f042" containerName="init" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610473 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba22cc82-1acd-4b8b-b7a8-3d598262b490" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610479 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba22cc82-1acd-4b8b-b7a8-3d598262b490" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610496 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12b745-718e-4ee6-9751-ba679e3e274f" containerName="dnsmasq-dns" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610502 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12b745-718e-4ee6-9751-ba679e3e274f" containerName="dnsmasq-dns" Mar 10 15:26:40 crc kubenswrapper[4743]: E0310 15:26:40.610529 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43962c3a-61ca-4e7c-b56c-d9eefcaddeee" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610536 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="43962c3a-61ca-4e7c-b56c-d9eefcaddeee" 
containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610689 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="43962c3a-61ca-4e7c-b56c-d9eefcaddeee" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610699 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b12b745-718e-4ee6-9751-ba679e3e274f" containerName="dnsmasq-dns" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610705 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="016d978b-7540-425c-8328-75d43cf9f042" containerName="dnsmasq-dns" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610714 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b11c74-122b-4da8-a2e2-040fab849d4b" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610721 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e25cc1-8829-47fa-8a68-9968a7ba8e75" containerName="keystone-db-sync" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610733 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c6da81-8a78-4383-aa08-580c5242a582" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610743 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5505a3d8-7971-4ab1-9d50-def772d62890" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610752 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba22cc82-1acd-4b8b-b7a8-3d598262b490" containerName="mariadb-account-create-update" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610759 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="700303a1-61db-4669-9dba-31bba76aa8a5" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 
15:26:40.610769 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6fe554d-63d8-4ba6-948b-d8658db57faa" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.610778 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1124ae-ce38-4379-b981-cb509dc25ce7" containerName="mariadb-database-create" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.611634 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.619099 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9r7st"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.620979 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.624968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.625053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.625278 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.625385 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.628188 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wwgbc" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.646295 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dmjpg"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.678472 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-9r7st"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781023 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781098 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-scripts\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-config\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781335 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2jl9\" (UniqueName: \"kubernetes.io/projected/47682936-cbfb-43a0-8e2c-b0287f6b44ef-kube-api-access-x2jl9\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781386 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-combined-ca-bundle\") pod \"keystone-bootstrap-9r7st\" (UID: 
\"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781481 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxzm\" (UniqueName: \"kubernetes.io/projected/b0a88d02-ab89-44be-94f8-758e2e2dd395-kube-api-access-7sxzm\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781586 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-fernet-keys\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-config-data\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: 
\"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781896 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-credential-keys\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.781940 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.821733 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bf9878dbc-v8nn7"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.823053 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.827501 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-spfz5" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.827560 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.827657 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.827705 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.873884 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bf9878dbc-v8nn7"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883419 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2jl9\" (UniqueName: \"kubernetes.io/projected/47682936-cbfb-43a0-8e2c-b0287f6b44ef-kube-api-access-x2jl9\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-combined-ca-bundle\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883532 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxzm\" (UniqueName: \"kubernetes.io/projected/b0a88d02-ab89-44be-94f8-758e2e2dd395-kube-api-access-7sxzm\") pod \"keystone-bootstrap-9r7st\" (UID: 
\"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-fernet-keys\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883600 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-config-data\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-credential-keys\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc 
kubenswrapper[4743]: I0310 15:26:40.883753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-scripts\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.883875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-config\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.884852 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-config\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.891356 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.898550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.899841 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.900566 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.905967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-combined-ca-bundle\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.908036 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-credential-keys\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.912463 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-fernet-keys\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.914252 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-86f6w"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.914878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-config-data\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.915306 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.925221 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.925302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxzm\" (UniqueName: \"kubernetes.io/projected/b0a88d02-ab89-44be-94f8-758e2e2dd395-kube-api-access-7sxzm\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.925615 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.925919 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mcr6l" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.929802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-scripts\") pod \"keystone-bootstrap-9r7st\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.930168 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-86f6w"] Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.933477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2jl9\" (UniqueName: \"kubernetes.io/projected/47682936-cbfb-43a0-8e2c-b0287f6b44ef-kube-api-access-x2jl9\") pod \"dnsmasq-dns-bbf5cc879-dmjpg\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.939423 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.947290 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.986842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqd52\" (UniqueName: \"kubernetes.io/projected/744d5501-5ab3-4086-acc8-9f7a01ca8513-kube-api-access-mqd52\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.986906 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/744d5501-5ab3-4086-acc8-9f7a01ca8513-horizon-secret-key\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.986962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-scripts\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.986981 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744d5501-5ab3-4086-acc8-9f7a01ca8513-logs\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:40 crc kubenswrapper[4743]: I0310 15:26:40.987025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-config-data\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.046972 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.049088 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.054899 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.055095 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.085884 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4hbxs"] Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.087120 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4hbxs" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-config-data\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-db-sync-config-data\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088589 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-config-data\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088605 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-scripts\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-combined-ca-bundle\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 
15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68a99e3b-6d76-485c-b284-5f275ba9bbef-etc-machine-id\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088675 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd5rg\" (UniqueName: \"kubernetes.io/projected/68a99e3b-6d76-485c-b284-5f275ba9bbef-kube-api-access-dd5rg\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqd52\" (UniqueName: \"kubernetes.io/projected/744d5501-5ab3-4086-acc8-9f7a01ca8513-kube-api-access-mqd52\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/744d5501-5ab3-4086-acc8-9f7a01ca8513-horizon-secret-key\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088788 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-scripts\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.088805 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744d5501-5ab3-4086-acc8-9f7a01ca8513-logs\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.089210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744d5501-5ab3-4086-acc8-9f7a01ca8513-logs\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.090843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-config-data\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.091436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-scripts\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.102323 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.102600 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2m64d" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.102711 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.112571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/744d5501-5ab3-4086-acc8-9f7a01ca8513-horizon-secret-key\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.122142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqd52\" (UniqueName: \"kubernetes.io/projected/744d5501-5ab3-4086-acc8-9f7a01ca8513-kube-api-access-mqd52\") pod \"horizon-bf9878dbc-v8nn7\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.133977 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-lgtjg"] Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.135449 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lgtjg" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.137343 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.139610 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-nj79g" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.142664 4743 scope.go:117] "RemoveContainer" containerID="1a5e0b52b7d3edafb69cba70213fad247de19d0a18c942cbeeaef768a0c4c220" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.143694 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.169560 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4hbxs"] Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-config-data\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-scripts\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-combined-ca-bundle\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p5hm\" (UniqueName: \"kubernetes.io/projected/88a25471-c13d-434e-9f74-82de0cd19099-kube-api-access-6p5hm\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68a99e3b-6d76-485c-b284-5f275ba9bbef-etc-machine-id\") pod 
\"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-combined-ca-bundle\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5rg\" (UniqueName: \"kubernetes.io/projected/68a99e3b-6d76-485c-b284-5f275ba9bbef-kube-api-access-dd5rg\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190600 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190640 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmzl\" (UniqueName: \"kubernetes.io/projected/482b3103-f6d6-410f-9106-b10ad1695c78-kube-api-access-hbmzl\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-log-httpd\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-config\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190738 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-scripts\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-run-httpd\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190772 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-config-data\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.190793 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-db-sync-config-data\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.198298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68a99e3b-6d76-485c-b284-5f275ba9bbef-etc-machine-id\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.198532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-scripts\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.200310 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-db-sync-config-data\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.203925 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-config-data\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.208223 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-combined-ca-bundle\") pod 
\"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.208298 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lgtjg"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.241919 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.258888 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-z94q2"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.265499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5rg\" (UniqueName: \"kubernetes.io/projected/68a99e3b-6d76-485c-b284-5f275ba9bbef-kube-api-access-dd5rg\") pod \"cinder-db-sync-86f6w\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " pod="openstack/cinder-db-sync-86f6w"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.275831 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.281290 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.281559 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w2ql2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.293987 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmzl\" (UniqueName: \"kubernetes.io/projected/482b3103-f6d6-410f-9106-b10ad1695c78-kube-api-access-hbmzl\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-combined-ca-bundle\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294158 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-log-httpd\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-config\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpq2h\" (UniqueName: \"kubernetes.io/projected/45477f7e-f216-40fb-acdb-d7a1dbadba99-kube-api-access-rpq2h\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-scripts\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294284 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-run-httpd\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-config-data\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294340 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-job-config-data\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p5hm\" (UniqueName: \"kubernetes.io/projected/88a25471-c13d-434e-9f74-82de0cd19099-kube-api-access-6p5hm\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-combined-ca-bundle\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.294464 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-config-data\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.298744 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-run-httpd\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.299529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-log-httpd\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.301029 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-z94q2"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.353730 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.354477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-combined-ca-bundle\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.355553 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-86f6w"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.356030 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.357037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbmzl\" (UniqueName: \"kubernetes.io/projected/482b3103-f6d6-410f-9106-b10ad1695c78-kube-api-access-hbmzl\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.357292 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-config-data\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.358476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-scripts\") pod \"ceilometer-0\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") " pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.359606 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-config\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.371303 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p5hm\" (UniqueName: \"kubernetes.io/projected/88a25471-c13d-434e-9f74-82de0cd19099-kube-api-access-6p5hm\") pod \"neutron-db-sync-4hbxs\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " pod="openstack/neutron-db-sync-4hbxs"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.401924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-job-config-data\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.402001 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-combined-ca-bundle\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.402131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-config-data\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.402192 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-db-sync-config-data\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.402249 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-combined-ca-bundle\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.402285 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-kube-api-access-jbppw\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.402334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpq2h\" (UniqueName: \"kubernetes.io/projected/45477f7e-f216-40fb-acdb-d7a1dbadba99-kube-api-access-rpq2h\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.425045 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-combined-ca-bundle\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.425405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-job-config-data\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.435492 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dmjpg"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.437604 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpq2h\" (UniqueName: \"kubernetes.io/projected/45477f7e-f216-40fb-acdb-d7a1dbadba99-kube-api-access-rpq2h\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.469654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-config-data\") pod \"manila-db-sync-lgtjg\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.472332 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.484786 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4hbxs"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.495224 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.496988 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.498630 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lgtjg"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.500635 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.500879 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.501012 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.501163 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.503542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-combined-ca-bundle\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.503643 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-db-sync-config-data\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.503706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-kube-api-access-jbppw\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.506294 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wh995"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.529764 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-kube-api-access-jbppw\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.529846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-db-sync-config-data\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.534687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-combined-ca-bundle\") pod \"barbican-db-sync-z94q2\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.558829 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-z7zlr"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.560528 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.585290 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.628877 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-lc42v"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.630360 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lc42v"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.634934 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.635171 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.635316 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-79znn"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.637441 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z94q2"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.674604 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-z7zlr"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711459 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711479 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-logs\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711541 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711594 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-config-data\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvvcl\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-kube-api-access-bvvcl\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-ceph\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-config\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.711795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.718248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.718344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.718400 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sm6w\" (UniqueName: \"kubernetes.io/projected/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-kube-api-access-7sm6w\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.718425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-scripts\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.727983 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57cf99654f-fqjnm"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.751251 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57cf99654f-fqjnm"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.781551 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lc42v"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.814934 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57cf99654f-fqjnm"]
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.823471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-config-data\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.823555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvvcl\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-kube-api-access-bvvcl\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.823591 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-ceph\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.823616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-scripts\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.823643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef9806a-40c1-468d-92d8-70e92819f27b-logs\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829001 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4zxg\" (UniqueName: \"kubernetes.io/projected/8ef9806a-40c1-468d-92d8-70e92819f27b-kube-api-access-w4zxg\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-config\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829463 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829521 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sm6w\" (UniqueName: \"kubernetes.io/projected/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-kube-api-access-7sm6w\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-scripts\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-combined-ca-bundle\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829662 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-config-data\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829842 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-logs\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.829875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.831186 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.831191 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.833839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.834451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-logs\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.835424 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-config\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.835591 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.836307 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.838309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.848514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-scripts\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.849855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.851908 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-config-data\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.855734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.858509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-ceph\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.880846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.883752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvvcl\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-kube-api-access-bvvcl\") pod \"glance-default-external-api-0\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " pod="openstack/glance-default-external-api-0"
Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.885304 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sm6w\" (UniqueName: \"kubernetes.io/projected/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-kube-api-access-7sm6w\") pod \"dnsmasq-dns-56df8fb6b7-z7zlr\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.896963 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.898633 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.903459 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.903783 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab83d2d-0f12-4be4-9b5e-41594df332ed-logs\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-combined-ca-bundle\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gplx\" (UniqueName: 
\"kubernetes.io/projected/fab83d2d-0f12-4be4-9b5e-41594df332ed-kube-api-access-5gplx\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933723 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-config-data\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fab83d2d-0f12-4be4-9b5e-41594df332ed-horizon-secret-key\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933888 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-scripts\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933909 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-config-data\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef9806a-40c1-468d-92d8-70e92819f27b-logs\") pod 
\"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933953 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4zxg\" (UniqueName: \"kubernetes.io/projected/8ef9806a-40c1-468d-92d8-70e92819f27b-kube-api-access-w4zxg\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.933980 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-scripts\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.942599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef9806a-40c1-468d-92d8-70e92819f27b-logs\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.943625 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-config-data\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.943923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-combined-ca-bundle\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 
15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.968785 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4zxg\" (UniqueName: \"kubernetes.io/projected/8ef9806a-40c1-468d-92d8-70e92819f27b-kube-api-access-w4zxg\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.979116 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-scripts\") pod \"placement-db-sync-lc42v\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:41 crc kubenswrapper[4743]: I0310 15:26:41.982426 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.006029 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016d978b-7540-425c-8328-75d43cf9f042" path="/var/lib/kubelet/pods/016d978b-7540-425c-8328-75d43cf9f042/volumes" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.018722 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.025669 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fab83d2d-0f12-4be4-9b5e-41594df332ed-horizon-secret-key\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-logs\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pktb2\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-kube-api-access-pktb2\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-config-data\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " 
pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038630 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038658 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-scripts\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038710 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab83d2d-0f12-4be4-9b5e-41594df332ed-logs\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038864 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-ceph\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.038889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gplx\" (UniqueName: \"kubernetes.io/projected/fab83d2d-0f12-4be4-9b5e-41594df332ed-kube-api-access-5gplx\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.040685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-config-data\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc 
kubenswrapper[4743]: I0310 15:26:42.042310 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab83d2d-0f12-4be4-9b5e-41594df332ed-logs\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.044068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-scripts\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.053659 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lc42v" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.062261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fab83d2d-0f12-4be4-9b5e-41594df332ed-horizon-secret-key\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.096902 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gplx\" (UniqueName: \"kubernetes.io/projected/fab83d2d-0f12-4be4-9b5e-41594df332ed-kube-api-access-5gplx\") pod \"horizon-57cf99654f-fqjnm\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.128978 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.140680 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.140875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.140935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.140967 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-ceph\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.141018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc 
kubenswrapper[4743]: I0310 15:26:42.141035 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-logs\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.141060 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pktb2\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-kube-api-access-pktb2\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.141083 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.141107 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.141587 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.143951 4743 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.145034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-logs\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.146476 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9r7st"] Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.153412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.156548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.176561 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc 
kubenswrapper[4743]: I0310 15:26:42.177314 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pktb2\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-kube-api-access-pktb2\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.181365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.183600 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-ceph\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.185765 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.224833 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.501036 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9r7st" event={"ID":"b0a88d02-ab89-44be-94f8-758e2e2dd395","Type":"ContainerStarted","Data":"490cbda36b4992b2861ba5f26bc7171a08da9fab2002ecef90e7ca0300334c20"} Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.546025 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dmjpg"] Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.558306 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bf9878dbc-v8nn7"] Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.656111 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4hbxs"] Mar 10 15:26:42 crc kubenswrapper[4743]: I0310 15:26:42.684968 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-86f6w"] Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.032348 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:26:43 crc kubenswrapper[4743]: W0310 15:26:43.042012 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod482b3103_f6d6_410f_9106_b10ad1695c78.slice/crio-1acb1cf3d6c2af05dee47b1149592a0ec38eecf3d1d1997f3e2f3231ab5e63c0 WatchSource:0}: Error finding container 1acb1cf3d6c2af05dee47b1149592a0ec38eecf3d1d1997f3e2f3231ab5e63c0: Status 404 returned error can't find the container with id 1acb1cf3d6c2af05dee47b1149592a0ec38eecf3d1d1997f3e2f3231ab5e63c0 Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.149313 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lgtjg"] Mar 10 15:26:43 crc kubenswrapper[4743]: W0310 15:26:43.170519 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45477f7e_f216_40fb_acdb_d7a1dbadba99.slice/crio-f3cf19d30864c06e7978df8f3c591e44269951e51d874a164ee596b12a048ab8 WatchSource:0}: Error finding container f3cf19d30864c06e7978df8f3c591e44269951e51d874a164ee596b12a048ab8: Status 404 returned error can't find the container with id f3cf19d30864c06e7978df8f3c591e44269951e51d874a164ee596b12a048ab8 Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.532348 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4hbxs" event={"ID":"88a25471-c13d-434e-9f74-82de0cd19099","Type":"ContainerStarted","Data":"d3dc28fde989f285e2d3ddd3cb9f48e1356b0d71457185b391360afe8cbe343f"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.532394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4hbxs" event={"ID":"88a25471-c13d-434e-9f74-82de0cd19099","Type":"ContainerStarted","Data":"ebccf062f0f8a734dcf3f355ca44fbc66335459db95e8454942408e8b2370bed"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.548549 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-z94q2"] Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.548636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" event={"ID":"47682936-cbfb-43a0-8e2c-b0287f6b44ef","Type":"ContainerDied","Data":"1d7953b9e3c323efdcced331de321ad691de9cb8cc10e834edd7cf59c2f38b32"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.548576 4743 generic.go:334] "Generic (PLEG): container finished" podID="47682936-cbfb-43a0-8e2c-b0287f6b44ef" containerID="1d7953b9e3c323efdcced331de321ad691de9cb8cc10e834edd7cf59c2f38b32" exitCode=0 Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.548755 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" 
event={"ID":"47682936-cbfb-43a0-8e2c-b0287f6b44ef","Type":"ContainerStarted","Data":"e0a8d0f7f8242fcd090961127d4356a3cd30119bbb0f3faf72ce14370834efec"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.562537 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"482b3103-f6d6-410f-9106-b10ad1695c78","Type":"ContainerStarted","Data":"1acb1cf3d6c2af05dee47b1149592a0ec38eecf3d1d1997f3e2f3231ab5e63c0"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.579797 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lc42v"] Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.582149 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4hbxs" podStartSLOduration=3.582131768 podStartE2EDuration="3.582131768s" podCreationTimestamp="2026-03-10 15:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:43.553944859 +0000 UTC m=+1268.260759607" watchObservedRunningTime="2026-03-10 15:26:43.582131768 +0000 UTC m=+1268.288946516" Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.592110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lgtjg" event={"ID":"45477f7e-f216-40fb-acdb-d7a1dbadba99","Type":"ContainerStarted","Data":"f3cf19d30864c06e7978df8f3c591e44269951e51d874a164ee596b12a048ab8"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.602233 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57cf99654f-fqjnm"] Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.617416 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bf9878dbc-v8nn7" event={"ID":"744d5501-5ab3-4086-acc8-9f7a01ca8513","Type":"ContainerStarted","Data":"e098227b669f1414a0778b8e70ecfefa42d13d98e0c97833041b8855e5b87e19"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 
15:26:43.640958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9r7st" event={"ID":"b0a88d02-ab89-44be-94f8-758e2e2dd395","Type":"ContainerStarted","Data":"1d8697df409a13c9ff4e89cae95eb036e351e739db865e100680a3a6b05918d7"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.666322 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-86f6w" event={"ID":"68a99e3b-6d76-485c-b284-5f275ba9bbef","Type":"ContainerStarted","Data":"cef2e7c147540f7b19254f5416cdd113250b631982a8a3fab85e9d38670a1a08"} Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.680526 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.690043 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-z7zlr"] Mar 10 15:26:43 crc kubenswrapper[4743]: I0310 15:26:43.700705 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9r7st" podStartSLOduration=3.700691986 podStartE2EDuration="3.700691986s" podCreationTimestamp="2026-03-10 15:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:43.700294234 +0000 UTC m=+1268.407108992" watchObservedRunningTime="2026-03-10 15:26:43.700691986 +0000 UTC m=+1268.407506734" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.118388 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.227415 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-config\") pod \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.227525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-svc\") pod \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.227594 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2jl9\" (UniqueName: \"kubernetes.io/projected/47682936-cbfb-43a0-8e2c-b0287f6b44ef-kube-api-access-x2jl9\") pod \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.227641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-nb\") pod \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.227685 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-swift-storage-0\") pod \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.227869 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-sb\") pod \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\" (UID: \"47682936-cbfb-43a0-8e2c-b0287f6b44ef\") " Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.236903 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47682936-cbfb-43a0-8e2c-b0287f6b44ef-kube-api-access-x2jl9" (OuterVolumeSpecName: "kube-api-access-x2jl9") pod "47682936-cbfb-43a0-8e2c-b0287f6b44ef" (UID: "47682936-cbfb-43a0-8e2c-b0287f6b44ef"). InnerVolumeSpecName "kube-api-access-x2jl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.275861 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "47682936-cbfb-43a0-8e2c-b0287f6b44ef" (UID: "47682936-cbfb-43a0-8e2c-b0287f6b44ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.281728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "47682936-cbfb-43a0-8e2c-b0287f6b44ef" (UID: "47682936-cbfb-43a0-8e2c-b0287f6b44ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.289802 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "47682936-cbfb-43a0-8e2c-b0287f6b44ef" (UID: "47682936-cbfb-43a0-8e2c-b0287f6b44ef"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.290022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-config" (OuterVolumeSpecName: "config") pod "47682936-cbfb-43a0-8e2c-b0287f6b44ef" (UID: "47682936-cbfb-43a0-8e2c-b0287f6b44ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.309297 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "47682936-cbfb-43a0-8e2c-b0287f6b44ef" (UID: "47682936-cbfb-43a0-8e2c-b0287f6b44ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.330893 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.330926 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.330938 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2jl9\" (UniqueName: \"kubernetes.io/projected/47682936-cbfb-43a0-8e2c-b0287f6b44ef-kube-api-access-x2jl9\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.330951 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 
15:26:44.330961 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.330970 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47682936-cbfb-43a0-8e2c-b0287f6b44ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.335696 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:44 crc kubenswrapper[4743]: W0310 15:26:44.366710 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95bd9860_cffc_40c7_8edc_7e148f9d0af4.slice/crio-c13a8203e8bfa8f954e4314f786c0bb3b4fda8c1f4054ca9b73cecd93b20ce7a WatchSource:0}: Error finding container c13a8203e8bfa8f954e4314f786c0bb3b4fda8c1f4054ca9b73cecd93b20ce7a: Status 404 returned error can't find the container with id c13a8203e8bfa8f954e4314f786c0bb3b4fda8c1f4054ca9b73cecd93b20ce7a Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.733958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95bd9860-cffc-40c7-8edc-7e148f9d0af4","Type":"ContainerStarted","Data":"c13a8203e8bfa8f954e4314f786c0bb3b4fda8c1f4054ca9b73cecd93b20ce7a"} Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.749593 4743 generic.go:334] "Generic (PLEG): container finished" podID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerID="69cdfd9df798610b35a6af430930d201b5ce3a778dcd0954f947237039177ee5" exitCode=0 Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.751367 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" 
event={"ID":"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43","Type":"ContainerDied","Data":"69cdfd9df798610b35a6af430930d201b5ce3a778dcd0954f947237039177ee5"} Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.757535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" event={"ID":"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43","Type":"ContainerStarted","Data":"0219e3f6ea9e5f7f3544453d460290940faac1e7503420ed2e8f725cc1527a6a"} Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.784350 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.798636 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.799793 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-dmjpg" event={"ID":"47682936-cbfb-43a0-8e2c-b0287f6b44ef","Type":"ContainerDied","Data":"e0a8d0f7f8242fcd090961127d4356a3cd30119bbb0f3faf72ce14370834efec"} Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.799858 4743 scope.go:117] "RemoveContainer" containerID="1d7953b9e3c323efdcced331de321ad691de9cb8cc10e834edd7cf59c2f38b32" Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.841498 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z94q2" event={"ID":"d7ff1eff-8355-40a9-b02d-cfb47e08bb46","Type":"ContainerStarted","Data":"fa72c4f309f4ed7d545948aacbd01ccc4753a9bcd62a5003c96855587feb5697"} Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.873785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lc42v" event={"ID":"8ef9806a-40c1-468d-92d8-70e92819f27b","Type":"ContainerStarted","Data":"d4e95baf3e35022aed87eb953f753ec5ccaf4742b01d87c1240b04094a621235"} Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.892960 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"490b9073-0616-4d54-8383-f9eeb519c605","Type":"ContainerStarted","Data":"90a7514e5adbd87cd547762091eee9607225a35cc1c62af9107c5235458f6f91"} Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.899856 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bf9878dbc-v8nn7"] Mar 10 15:26:44 crc kubenswrapper[4743]: I0310 15:26:44.969596 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57cf99654f-fqjnm" event={"ID":"fab83d2d-0f12-4be4-9b5e-41594df332ed","Type":"ContainerStarted","Data":"e1b5343b4dca3ceb1633a32273b87618f7240374e1988adb7722fab897b95a81"} Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:44.990960 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c58bbcd67-dxpcc"] Mar 10 15:26:45 crc kubenswrapper[4743]: E0310 15:26:44.991510 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47682936-cbfb-43a0-8e2c-b0287f6b44ef" containerName="init" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:44.991547 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="47682936-cbfb-43a0-8e2c-b0287f6b44ef" containerName="init" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:44.991779 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="47682936-cbfb-43a0-8e2c-b0287f6b44ef" containerName="init" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:44.992985 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.048878 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.077433 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c58bbcd67-dxpcc"] Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.078281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgz56\" (UniqueName: \"kubernetes.io/projected/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-kube-api-access-vgz56\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.078361 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-horizon-secret-key\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.078576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-config-data\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.078625 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-scripts\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 
15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.078706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-logs\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.135628 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.138622 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dmjpg"] Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.179682 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dmjpg"] Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.192225 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-config-data\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.192294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-scripts\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.192342 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-logs\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 
15:26:45.192441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgz56\" (UniqueName: \"kubernetes.io/projected/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-kube-api-access-vgz56\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.192487 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-horizon-secret-key\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.193636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-scripts\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.194839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-config-data\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.195151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-logs\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.201212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-horizon-secret-key\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.217080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgz56\" (UniqueName: \"kubernetes.io/projected/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-kube-api-access-vgz56\") pod \"horizon-7c58bbcd67-dxpcc\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.382226 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:26:45 crc kubenswrapper[4743]: I0310 15:26:45.951451 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47682936-cbfb-43a0-8e2c-b0287f6b44ef" path="/var/lib/kubelet/pods/47682936-cbfb-43a0-8e2c-b0287f6b44ef/volumes" Mar 10 15:26:46 crc kubenswrapper[4743]: I0310 15:26:46.010249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"490b9073-0616-4d54-8383-f9eeb519c605","Type":"ContainerStarted","Data":"5b8243bc446a56930e8c070f7aeb7f03280904b8fca0ad33289ec0971c5b3d99"} Mar 10 15:26:46 crc kubenswrapper[4743]: I0310 15:26:46.025721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" event={"ID":"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43","Type":"ContainerStarted","Data":"2cc3a28c5a1bba28c6a1fcb245a3a9a5bc8469f4463eda608baaa10ade1d0448"} Mar 10 15:26:46 crc kubenswrapper[4743]: I0310 15:26:46.026940 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" Mar 10 15:26:46 crc kubenswrapper[4743]: W0310 15:26:46.163044 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a5acc3b_0431_490e_b3c8_3b2ffa682f8b.slice/crio-ad362fe8c86aa9064ae7e4957010462f8ca217a0f84a3acb8a2c8792e1a1a26b WatchSource:0}: Error finding container ad362fe8c86aa9064ae7e4957010462f8ca217a0f84a3acb8a2c8792e1a1a26b: Status 404 returned error can't find the container with id ad362fe8c86aa9064ae7e4957010462f8ca217a0f84a3acb8a2c8792e1a1a26b Mar 10 15:26:46 crc kubenswrapper[4743]: I0310 15:26:46.218877 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c58bbcd67-dxpcc"] Mar 10 15:26:46 crc kubenswrapper[4743]: I0310 15:26:46.265738 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" podStartSLOduration=5.265712892 podStartE2EDuration="5.265712892s" podCreationTimestamp="2026-03-10 15:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:46.19998329 +0000 UTC m=+1270.906798038" watchObservedRunningTime="2026-03-10 15:26:46.265712892 +0000 UTC m=+1270.972527650" Mar 10 15:26:47 crc kubenswrapper[4743]: I0310 15:26:47.052499 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c58bbcd67-dxpcc" event={"ID":"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b","Type":"ContainerStarted","Data":"ad362fe8c86aa9064ae7e4957010462f8ca217a0f84a3acb8a2c8792e1a1a26b"} Mar 10 15:26:47 crc kubenswrapper[4743]: I0310 15:26:47.055947 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95bd9860-cffc-40c7-8edc-7e148f9d0af4","Type":"ContainerStarted","Data":"625b14be5b44ebeb352f02f37ff237f071c8ed12f0b87f1e6c062da83d05743e"} Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.068197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"95bd9860-cffc-40c7-8edc-7e148f9d0af4","Type":"ContainerStarted","Data":"f006f53a8dbbc19008d420e0abfdcdbc2068e3a5979cca2ea1721774a8e2c209"} Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.068350 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerName="glance-log" containerID="cri-o://625b14be5b44ebeb352f02f37ff237f071c8ed12f0b87f1e6c062da83d05743e" gracePeriod=30 Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.068659 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerName="glance-httpd" containerID="cri-o://f006f53a8dbbc19008d420e0abfdcdbc2068e3a5979cca2ea1721774a8e2c209" gracePeriod=30 Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.078487 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"490b9073-0616-4d54-8383-f9eeb519c605","Type":"ContainerStarted","Data":"4950466aabda7b273bc607f3f199f00342d0a413584aca09688ce8ad8c76fa59"} Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.078612 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="490b9073-0616-4d54-8383-f9eeb519c605" containerName="glance-log" containerID="cri-o://5b8243bc446a56930e8c070f7aeb7f03280904b8fca0ad33289ec0971c5b3d99" gracePeriod=30 Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.078676 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="490b9073-0616-4d54-8383-f9eeb519c605" containerName="glance-httpd" containerID="cri-o://4950466aabda7b273bc607f3f199f00342d0a413584aca09688ce8ad8c76fa59" gracePeriod=30 Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.083256 4743 generic.go:334] 
"Generic (PLEG): container finished" podID="b0a88d02-ab89-44be-94f8-758e2e2dd395" containerID="1d8697df409a13c9ff4e89cae95eb036e351e739db865e100680a3a6b05918d7" exitCode=0 Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.083302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9r7st" event={"ID":"b0a88d02-ab89-44be-94f8-758e2e2dd395","Type":"ContainerDied","Data":"1d8697df409a13c9ff4e89cae95eb036e351e739db865e100680a3a6b05918d7"} Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.101007 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.100979451 podStartE2EDuration="7.100979451s" podCreationTimestamp="2026-03-10 15:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:48.088051655 +0000 UTC m=+1272.794866463" watchObservedRunningTime="2026-03-10 15:26:48.100979451 +0000 UTC m=+1272.807794199" Mar 10 15:26:48 crc kubenswrapper[4743]: I0310 15:26:48.141089 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.141068747 podStartE2EDuration="7.141068747s" podCreationTimestamp="2026-03-10 15:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:48.139931714 +0000 UTC m=+1272.846746452" watchObservedRunningTime="2026-03-10 15:26:48.141068747 +0000 UTC m=+1272.847883485" Mar 10 15:26:49 crc kubenswrapper[4743]: I0310 15:26:49.098946 4743 generic.go:334] "Generic (PLEG): container finished" podID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerID="f006f53a8dbbc19008d420e0abfdcdbc2068e3a5979cca2ea1721774a8e2c209" exitCode=0 Mar 10 15:26:49 crc kubenswrapper[4743]: I0310 15:26:49.099236 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerID="625b14be5b44ebeb352f02f37ff237f071c8ed12f0b87f1e6c062da83d05743e" exitCode=143 Mar 10 15:26:49 crc kubenswrapper[4743]: I0310 15:26:49.099092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95bd9860-cffc-40c7-8edc-7e148f9d0af4","Type":"ContainerDied","Data":"f006f53a8dbbc19008d420e0abfdcdbc2068e3a5979cca2ea1721774a8e2c209"} Mar 10 15:26:49 crc kubenswrapper[4743]: I0310 15:26:49.099351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95bd9860-cffc-40c7-8edc-7e148f9d0af4","Type":"ContainerDied","Data":"625b14be5b44ebeb352f02f37ff237f071c8ed12f0b87f1e6c062da83d05743e"} Mar 10 15:26:49 crc kubenswrapper[4743]: I0310 15:26:49.102345 4743 generic.go:334] "Generic (PLEG): container finished" podID="490b9073-0616-4d54-8383-f9eeb519c605" containerID="4950466aabda7b273bc607f3f199f00342d0a413584aca09688ce8ad8c76fa59" exitCode=0 Mar 10 15:26:49 crc kubenswrapper[4743]: I0310 15:26:49.102378 4743 generic.go:334] "Generic (PLEG): container finished" podID="490b9073-0616-4d54-8383-f9eeb519c605" containerID="5b8243bc446a56930e8c070f7aeb7f03280904b8fca0ad33289ec0971c5b3d99" exitCode=143 Mar 10 15:26:49 crc kubenswrapper[4743]: I0310 15:26:49.102405 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"490b9073-0616-4d54-8383-f9eeb519c605","Type":"ContainerDied","Data":"4950466aabda7b273bc607f3f199f00342d0a413584aca09688ce8ad8c76fa59"} Mar 10 15:26:49 crc kubenswrapper[4743]: I0310 15:26:49.102435 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"490b9073-0616-4d54-8383-f9eeb519c605","Type":"ContainerDied","Data":"5b8243bc446a56930e8c070f7aeb7f03280904b8fca0ad33289ec0971c5b3d99"} Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.324832 4743 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/horizon-57cf99654f-fqjnm"] Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.372874 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7954db6464-ns5cf"] Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.374305 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.379187 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.395922 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7954db6464-ns5cf"] Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.448931 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c58bbcd67-dxpcc"] Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.488072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-combined-ca-bundle\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.488123 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-secret-key\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.488151 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0001988-feba-4afe-9068-071af12a6fd7-logs\") pod \"horizon-7954db6464-ns5cf\" (UID: 
\"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.488275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-config-data\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.488304 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-scripts\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.488346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xssfx\" (UniqueName: \"kubernetes.io/projected/c0001988-feba-4afe-9068-071af12a6fd7-kube-api-access-xssfx\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.488497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-tls-certs\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.509242 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-958fd895b-mxn2t"] Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.510766 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.537282 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-958fd895b-mxn2t"] Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.590675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xssfx\" (UniqueName: \"kubernetes.io/projected/c0001988-feba-4afe-9068-071af12a6fd7-kube-api-access-xssfx\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.590753 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-logs\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.590787 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-tls-certs\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.590847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-combined-ca-bundle\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.590878 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-secret-key\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.590904 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0001988-feba-4afe-9068-071af12a6fd7-logs\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.590965 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-combined-ca-bundle\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.591020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-horizon-secret-key\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.591064 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-horizon-tls-certs\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.591092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-config-data\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.591118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-config-data\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.591160 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-scripts\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.591186 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-scripts\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.591208 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smk6\" (UniqueName: \"kubernetes.io/projected/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-kube-api-access-5smk6\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.594210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-config-data\") pod 
\"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.594646 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-scripts\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.595051 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0001988-feba-4afe-9068-071af12a6fd7-logs\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.600453 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-tls-certs\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.604883 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-combined-ca-bundle\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.613789 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-secret-key\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc 
kubenswrapper[4743]: I0310 15:26:50.619487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xssfx\" (UniqueName: \"kubernetes.io/projected/c0001988-feba-4afe-9068-071af12a6fd7-kube-api-access-xssfx\") pod \"horizon-7954db6464-ns5cf\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.693376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smk6\" (UniqueName: \"kubernetes.io/projected/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-kube-api-access-5smk6\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.693449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-scripts\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.693515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-logs\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.693618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-combined-ca-bundle\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.693676 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-horizon-secret-key\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.693722 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-horizon-tls-certs\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.693749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-config-data\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.696217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-scripts\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.697138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-config-data\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.697719 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-logs\") pod \"horizon-958fd895b-mxn2t\" 
(UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.698523 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-combined-ca-bundle\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.700762 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-horizon-tls-certs\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.706571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-horizon-secret-key\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.710940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smk6\" (UniqueName: \"kubernetes.io/projected/cccf05c8-d4e8-4a1d-912f-5f4a37440ac7-kube-api-access-5smk6\") pod \"horizon-958fd895b-mxn2t\" (UID: \"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7\") " pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.745486 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:26:50 crc kubenswrapper[4743]: I0310 15:26:50.828923 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:26:52 crc kubenswrapper[4743]: I0310 15:26:52.029079 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" Mar 10 15:26:52 crc kubenswrapper[4743]: I0310 15:26:52.097374 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-dmqvp"] Mar 10 15:26:52 crc kubenswrapper[4743]: I0310 15:26:52.097685 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="dnsmasq-dns" containerID="cri-o://c0bb05b0860d33025afaa2609668f2d6dbd274f9a1c8c8e0727fd082c1e893d4" gracePeriod=10 Mar 10 15:26:53 crc kubenswrapper[4743]: I0310 15:26:53.153402 4743 generic.go:334] "Generic (PLEG): container finished" podID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerID="c0bb05b0860d33025afaa2609668f2d6dbd274f9a1c8c8e0727fd082c1e893d4" exitCode=0 Mar 10 15:26:53 crc kubenswrapper[4743]: I0310 15:26:53.153449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" event={"ID":"219b9d1c-8e83-4f19-9163-ebb0ef8df490","Type":"ContainerDied","Data":"c0bb05b0860d33025afaa2609668f2d6dbd274f9a1c8c8e0727fd082c1e893d4"} Mar 10 15:26:54 crc kubenswrapper[4743]: I0310 15:26:54.093362 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Mar 10 15:26:58 crc kubenswrapper[4743]: E0310 15:26:58.788803 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 10 15:26:58 crc kubenswrapper[4743]: E0310 15:26:58.789023 
4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4zxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:
[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-lc42v_openstack(8ef9806a-40c1-468d-92d8-70e92819f27b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:26:58 crc kubenswrapper[4743]: E0310 15:26:58.790245 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-lc42v" podUID="8ef9806a-40c1-468d-92d8-70e92819f27b" Mar 10 15:26:58 crc kubenswrapper[4743]: I0310 15:26:58.902363 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.017079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-scripts\") pod \"b0a88d02-ab89-44be-94f8-758e2e2dd395\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.017139 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-config-data\") pod \"b0a88d02-ab89-44be-94f8-758e2e2dd395\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.017251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-credential-keys\") pod \"b0a88d02-ab89-44be-94f8-758e2e2dd395\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.017319 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sxzm\" (UniqueName: \"kubernetes.io/projected/b0a88d02-ab89-44be-94f8-758e2e2dd395-kube-api-access-7sxzm\") pod \"b0a88d02-ab89-44be-94f8-758e2e2dd395\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.017409 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-combined-ca-bundle\") pod \"b0a88d02-ab89-44be-94f8-758e2e2dd395\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.017533 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-fernet-keys\") pod \"b0a88d02-ab89-44be-94f8-758e2e2dd395\" (UID: \"b0a88d02-ab89-44be-94f8-758e2e2dd395\") " Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.024079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0a88d02-ab89-44be-94f8-758e2e2dd395" (UID: "b0a88d02-ab89-44be-94f8-758e2e2dd395"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.024839 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a88d02-ab89-44be-94f8-758e2e2dd395-kube-api-access-7sxzm" (OuterVolumeSpecName: "kube-api-access-7sxzm") pod "b0a88d02-ab89-44be-94f8-758e2e2dd395" (UID: "b0a88d02-ab89-44be-94f8-758e2e2dd395"). InnerVolumeSpecName "kube-api-access-7sxzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.025048 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0a88d02-ab89-44be-94f8-758e2e2dd395" (UID: "b0a88d02-ab89-44be-94f8-758e2e2dd395"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.025085 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-scripts" (OuterVolumeSpecName: "scripts") pod "b0a88d02-ab89-44be-94f8-758e2e2dd395" (UID: "b0a88d02-ab89-44be-94f8-758e2e2dd395"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.048495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-config-data" (OuterVolumeSpecName: "config-data") pod "b0a88d02-ab89-44be-94f8-758e2e2dd395" (UID: "b0a88d02-ab89-44be-94f8-758e2e2dd395"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.060360 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0a88d02-ab89-44be-94f8-758e2e2dd395" (UID: "b0a88d02-ab89-44be-94f8-758e2e2dd395"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.092739 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.120149 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.120182 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sxzm\" (UniqueName: \"kubernetes.io/projected/b0a88d02-ab89-44be-94f8-758e2e2dd395-kube-api-access-7sxzm\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.120194 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.120203 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.120212 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.120220 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a88d02-ab89-44be-94f8-758e2e2dd395-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:59 crc 
kubenswrapper[4743]: I0310 15:26:59.207851 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9r7st" event={"ID":"b0a88d02-ab89-44be-94f8-758e2e2dd395","Type":"ContainerDied","Data":"490cbda36b4992b2861ba5f26bc7171a08da9fab2002ecef90e7ca0300334c20"} Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.207904 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9r7st" Mar 10 15:26:59 crc kubenswrapper[4743]: I0310 15:26:59.207922 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490cbda36b4992b2861ba5f26bc7171a08da9fab2002ecef90e7ca0300334c20" Mar 10 15:26:59 crc kubenswrapper[4743]: E0310 15:26:59.209981 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-lc42v" podUID="8ef9806a-40c1-468d-92d8-70e92819f27b" Mar 10 15:26:59 crc kubenswrapper[4743]: E0310 15:26:59.277958 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 10 15:26:59 crc kubenswrapper[4743]: E0310 15:26:59.278121 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b9h555h7bh8h5cdh667h5b8h687h66dh564h674h5dhf9h5cdh68bhdfh544h658h546h5d8h59h5h67dh694hbbh557hf5h88h678hdbh698h667q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hbmzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(482b3103-f6d6-410f-9106-b10ad1695c78): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.013411 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9r7st"] Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.021620 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9r7st"] Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.114302 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8fnbh"] Mar 10 15:27:00 crc kubenswrapper[4743]: E0310 15:27:00.114971 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a88d02-ab89-44be-94f8-758e2e2dd395" containerName="keystone-bootstrap" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.114999 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a88d02-ab89-44be-94f8-758e2e2dd395" containerName="keystone-bootstrap" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.115291 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b0a88d02-ab89-44be-94f8-758e2e2dd395" containerName="keystone-bootstrap" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.116240 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.119579 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wwgbc" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.120621 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.121132 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.121183 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.121364 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.127318 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8fnbh"] Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.153553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl556\" (UniqueName: \"kubernetes.io/projected/57d812f1-305a-40e9-b8ff-51b5e640ca57-kube-api-access-sl556\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.154021 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-fernet-keys\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " 
pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.154109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-config-data\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.154153 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-credential-keys\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.154363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-scripts\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.154472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-combined-ca-bundle\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.216637 4743 generic.go:334] "Generic (PLEG): container finished" podID="88a25471-c13d-434e-9f74-82de0cd19099" containerID="d3dc28fde989f285e2d3ddd3cb9f48e1356b0d71457185b391360afe8cbe343f" exitCode=0 Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.216692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-4hbxs" event={"ID":"88a25471-c13d-434e-9f74-82de0cd19099","Type":"ContainerDied","Data":"d3dc28fde989f285e2d3ddd3cb9f48e1356b0d71457185b391360afe8cbe343f"} Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.257413 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-scripts\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.257506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-combined-ca-bundle\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.257619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl556\" (UniqueName: \"kubernetes.io/projected/57d812f1-305a-40e9-b8ff-51b5e640ca57-kube-api-access-sl556\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.257693 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-fernet-keys\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.257720 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-config-data\") pod \"keystone-bootstrap-8fnbh\" (UID: 
\"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.257739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-credential-keys\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.264464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-fernet-keys\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.264498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-scripts\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.264547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-credential-keys\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.264595 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-combined-ca-bundle\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.266036 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-config-data\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.277953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl556\" (UniqueName: \"kubernetes.io/projected/57d812f1-305a-40e9-b8ff-51b5e640ca57-kube-api-access-sl556\") pod \"keystone-bootstrap-8fnbh\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:00 crc kubenswrapper[4743]: I0310 15:27:00.458985 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:01 crc kubenswrapper[4743]: I0310 15:27:01.928440 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a88d02-ab89-44be-94f8-758e2e2dd395" path="/var/lib/kubelet/pods/b0a88d02-ab89-44be-94f8-758e2e2dd395/volumes" Mar 10 15:27:02 crc kubenswrapper[4743]: E0310 15:27:02.237901 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 10 15:27:02 crc kubenswrapper[4743]: E0310 15:27:02.238084 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbppw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-z94q2_openstack(d7ff1eff-8355-40a9-b02d-cfb47e08bb46): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:27:02 crc kubenswrapper[4743]: E0310 15:27:02.239198 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-z94q2" 
podUID="d7ff1eff-8355-40a9-b02d-cfb47e08bb46" Mar 10 15:27:02 crc kubenswrapper[4743]: E0310 15:27:02.262071 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 10 15:27:02 crc kubenswrapper[4743]: E0310 15:27:02.262248 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n544h688h54bh66bh64ch66h66dh668hcfh6dh5c6hbfhd9h5bbh5c6hd5h689h5f6h5dfh5b6h548h5c6hbchc9hb9hf5h5d4h5d5h649hc6h57ch5dfq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqd52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesys
tem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-bf9878dbc-v8nn7_openstack(744d5501-5ab3-4086-acc8-9f7a01ca8513): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:27:02 crc kubenswrapper[4743]: E0310 15:27:02.267134 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-bf9878dbc-v8nn7" podUID="744d5501-5ab3-4086-acc8-9f7a01ca8513" Mar 10 15:27:03 crc kubenswrapper[4743]: E0310 15:27:03.259037 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-z94q2" podUID="d7ff1eff-8355-40a9-b02d-cfb47e08bb46" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.093087 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.093249 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" Mar 10 15:27:04 crc kubenswrapper[4743]: 
E0310 15:27:04.564620 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 10 15:27:04 crc kubenswrapper[4743]: E0310 15:27:04.565086 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n589hbchdbh56bh68bh56bh597h5h66ch86hdbh9h65h696h559h5cfh57fh89h5c4h5b5h5f4h5b6h669h5c6h58fhcdh5f6h75hf6h556h5dhbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gplx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,Se
ccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-57cf99654f-fqjnm_openstack(fab83d2d-0f12-4be4-9b5e-41594df332ed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:27:04 crc kubenswrapper[4743]: E0310 15:27:04.567248 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-57cf99654f-fqjnm" podUID="fab83d2d-0f12-4be4-9b5e-41594df332ed" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.639878 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.661916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-config-data\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.661979 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-combined-ca-bundle\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.662026 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pktb2\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-kube-api-access-pktb2\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.662062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-logs\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.662089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.662110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-internal-tls-certs\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.662144 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-ceph\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.662246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-httpd-run\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.662267 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-scripts\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.666465 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.666697 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-logs" (OuterVolumeSpecName: "logs") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.688853 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.688933 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-ceph" (OuterVolumeSpecName: "ceph") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.688945 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-scripts" (OuterVolumeSpecName: "scripts") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.688972 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-kube-api-access-pktb2" (OuterVolumeSpecName: "kube-api-access-pktb2") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "kube-api-access-pktb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.718585 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.768931 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.769235 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-internal-tls-certs\") pod \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\" (UID: \"95bd9860-cffc-40c7-8edc-7e148f9d0af4\") " Mar 10 15:27:04 crc kubenswrapper[4743]: W0310 15:27:04.769411 4743 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/95bd9860-cffc-40c7-8edc-7e148f9d0af4/volumes/kubernetes.io~secret/internal-tls-certs Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.769426 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.770118 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.770136 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.770148 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.770161 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pktb2\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-kube-api-access-pktb2\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.770174 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95bd9860-cffc-40c7-8edc-7e148f9d0af4-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.770207 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.770217 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.770225 4743 reconciler_common.go:293] "Volume 
detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/95bd9860-cffc-40c7-8edc-7e148f9d0af4-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.778753 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-config-data" (OuterVolumeSpecName: "config-data") pod "95bd9860-cffc-40c7-8edc-7e148f9d0af4" (UID: "95bd9860-cffc-40c7-8edc-7e148f9d0af4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.798804 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.871706 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95bd9860-cffc-40c7-8edc-7e148f9d0af4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4743]: I0310 15:27:04.872082 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.273627 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95bd9860-cffc-40c7-8edc-7e148f9d0af4","Type":"ContainerDied","Data":"c13a8203e8bfa8f954e4314f786c0bb3b4fda8c1f4054ca9b73cecd93b20ce7a"} Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.273684 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.273686 4743 scope.go:117] "RemoveContainer" containerID="f006f53a8dbbc19008d420e0abfdcdbc2068e3a5979cca2ea1721774a8e2c209" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.326517 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.334409 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.349970 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:05 crc kubenswrapper[4743]: E0310 15:27:05.350441 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerName="glance-log" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.350464 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerName="glance-log" Mar 10 15:27:05 crc kubenswrapper[4743]: E0310 15:27:05.350480 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerName="glance-httpd" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.350491 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerName="glance-httpd" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.350719 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerName="glance-log" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.350753 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" containerName="glance-httpd" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.351768 4743 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.359037 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.359389 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.373435 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486580 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486691 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-ceph\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-logs\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486802 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486859 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvb4\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-kube-api-access-4gvb4\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.486875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.589709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvb4\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-kube-api-access-4gvb4\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.589763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.589833 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.589900 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.589928 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-ceph\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.589953 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-logs\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.589988 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.590016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.590072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.590607 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.590923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-logs\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.595489 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.595766 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.602785 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.604664 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-ceph\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.607115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.610906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvb4\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-kube-api-access-4gvb4\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.610912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.631182 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.675257 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:05 crc kubenswrapper[4743]: I0310 15:27:05.929375 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95bd9860-cffc-40c7-8edc-7e148f9d0af4" path="/var/lib/kubelet/pods/95bd9860-cffc-40c7-8edc-7e148f9d0af4/volumes" Mar 10 15:27:11 crc kubenswrapper[4743]: I0310 15:27:11.984894 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 15:27:11 crc kubenswrapper[4743]: I0310 15:27:11.985306 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.231364 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.238172 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4hbxs" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.252021 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.340011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"490b9073-0616-4d54-8383-f9eeb519c605","Type":"ContainerDied","Data":"90a7514e5adbd87cd547762091eee9607225a35cc1c62af9107c5235458f6f91"} Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.340052 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.344121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bf9878dbc-v8nn7" event={"ID":"744d5501-5ab3-4086-acc8-9f7a01ca8513","Type":"ContainerDied","Data":"e098227b669f1414a0778b8e70ecfefa42d13d98e0c97833041b8855e5b87e19"} Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.344127 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bf9878dbc-v8nn7" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.346041 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4hbxs" event={"ID":"88a25471-c13d-434e-9f74-82de0cd19099","Type":"ContainerDied","Data":"ebccf062f0f8a734dcf3f355ca44fbc66335459db95e8454942408e8b2370bed"} Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.346147 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebccf062f0f8a734dcf3f355ca44fbc66335459db95e8454942408e8b2370bed" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.346271 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4hbxs" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361653 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744d5501-5ab3-4086-acc8-9f7a01ca8513-logs\") pod \"744d5501-5ab3-4086-acc8-9f7a01ca8513\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-config\") pod \"88a25471-c13d-434e-9f74-82de0cd19099\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361778 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p5hm\" (UniqueName: \"kubernetes.io/projected/88a25471-c13d-434e-9f74-82de0cd19099-kube-api-access-6p5hm\") pod \"88a25471-c13d-434e-9f74-82de0cd19099\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361837 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-scripts\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361859 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvvcl\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-kube-api-access-bvvcl\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361894 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-config-data\") pod \"744d5501-5ab3-4086-acc8-9f7a01ca8513\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361932 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-config-data\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361953 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqd52\" (UniqueName: \"kubernetes.io/projected/744d5501-5ab3-4086-acc8-9f7a01ca8513-kube-api-access-mqd52\") pod \"744d5501-5ab3-4086-acc8-9f7a01ca8513\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.361973 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-public-tls-certs\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.362030 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-combined-ca-bundle\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.362073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-scripts\") pod \"744d5501-5ab3-4086-acc8-9f7a01ca8513\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 
15:27:12.362118 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/744d5501-5ab3-4086-acc8-9f7a01ca8513-horizon-secret-key\") pod \"744d5501-5ab3-4086-acc8-9f7a01ca8513\" (UID: \"744d5501-5ab3-4086-acc8-9f7a01ca8513\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.362175 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.362209 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-httpd-run\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.362247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-combined-ca-bundle\") pod \"88a25471-c13d-434e-9f74-82de0cd19099\" (UID: \"88a25471-c13d-434e-9f74-82de0cd19099\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.362301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-logs\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.362349 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-ceph\") pod \"490b9073-0616-4d54-8383-f9eeb519c605\" (UID: \"490b9073-0616-4d54-8383-f9eeb519c605\") " 
Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.363122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/744d5501-5ab3-4086-acc8-9f7a01ca8513-logs" (OuterVolumeSpecName: "logs") pod "744d5501-5ab3-4086-acc8-9f7a01ca8513" (UID: "744d5501-5ab3-4086-acc8-9f7a01ca8513"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.364100 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-logs" (OuterVolumeSpecName: "logs") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.367715 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-scripts" (OuterVolumeSpecName: "scripts") pod "744d5501-5ab3-4086-acc8-9f7a01ca8513" (UID: "744d5501-5ab3-4086-acc8-9f7a01ca8513"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.367926 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744d5501-5ab3-4086-acc8-9f7a01ca8513-kube-api-access-mqd52" (OuterVolumeSpecName: "kube-api-access-mqd52") pod "744d5501-5ab3-4086-acc8-9f7a01ca8513" (UID: "744d5501-5ab3-4086-acc8-9f7a01ca8513"). InnerVolumeSpecName "kube-api-access-mqd52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.368720 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.369006 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-config-data" (OuterVolumeSpecName: "config-data") pod "744d5501-5ab3-4086-acc8-9f7a01ca8513" (UID: "744d5501-5ab3-4086-acc8-9f7a01ca8513"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.369186 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-ceph" (OuterVolumeSpecName: "ceph") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.372147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-kube-api-access-bvvcl" (OuterVolumeSpecName: "kube-api-access-bvvcl") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "kube-api-access-bvvcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.372192 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-scripts" (OuterVolumeSpecName: "scripts") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.372842 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/744d5501-5ab3-4086-acc8-9f7a01ca8513-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "744d5501-5ab3-4086-acc8-9f7a01ca8513" (UID: "744d5501-5ab3-4086-acc8-9f7a01ca8513"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.372855 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.378147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a25471-c13d-434e-9f74-82de0cd19099-kube-api-access-6p5hm" (OuterVolumeSpecName: "kube-api-access-6p5hm") pod "88a25471-c13d-434e-9f74-82de0cd19099" (UID: "88a25471-c13d-434e-9f74-82de0cd19099"). InnerVolumeSpecName "kube-api-access-6p5hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.399552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-config" (OuterVolumeSpecName: "config") pod "88a25471-c13d-434e-9f74-82de0cd19099" (UID: "88a25471-c13d-434e-9f74-82de0cd19099"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.399765 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88a25471-c13d-434e-9f74-82de0cd19099" (UID: "88a25471-c13d-434e-9f74-82de0cd19099"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.402458 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.423411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.426348 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-config-data" (OuterVolumeSpecName: "config-data") pod "490b9073-0616-4d54-8383-f9eeb519c605" (UID: "490b9073-0616-4d54-8383-f9eeb519c605"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464802 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464865 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p5hm\" (UniqueName: \"kubernetes.io/projected/88a25471-c13d-434e-9f74-82de0cd19099-kube-api-access-6p5hm\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464876 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464884 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvvcl\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-kube-api-access-bvvcl\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464914 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464925 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464933 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqd52\" (UniqueName: \"kubernetes.io/projected/744d5501-5ab3-4086-acc8-9f7a01ca8513-kube-api-access-mqd52\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464941 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464949 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490b9073-0616-4d54-8383-f9eeb519c605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464957 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/744d5501-5ab3-4086-acc8-9f7a01ca8513-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.464964 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/744d5501-5ab3-4086-acc8-9f7a01ca8513-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.465010 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.465019 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc 
kubenswrapper[4743]: I0310 15:27:12.465027 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a25471-c13d-434e-9f74-82de0cd19099-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.465035 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490b9073-0616-4d54-8383-f9eeb519c605-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.465042 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/490b9073-0616-4d54-8383-f9eeb519c605-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.465050 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744d5501-5ab3-4086-acc8-9f7a01ca8513-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.484520 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.567131 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.687316 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.699951 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.730658 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:12 crc 
kubenswrapper[4743]: E0310 15:27:12.731171 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a25471-c13d-434e-9f74-82de0cd19099" containerName="neutron-db-sync" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.731193 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a25471-c13d-434e-9f74-82de0cd19099" containerName="neutron-db-sync" Mar 10 15:27:12 crc kubenswrapper[4743]: E0310 15:27:12.731215 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490b9073-0616-4d54-8383-f9eeb519c605" containerName="glance-httpd" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.731222 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="490b9073-0616-4d54-8383-f9eeb519c605" containerName="glance-httpd" Mar 10 15:27:12 crc kubenswrapper[4743]: E0310 15:27:12.731243 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490b9073-0616-4d54-8383-f9eeb519c605" containerName="glance-log" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.731249 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="490b9073-0616-4d54-8383-f9eeb519c605" containerName="glance-log" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.731425 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="490b9073-0616-4d54-8383-f9eeb519c605" containerName="glance-httpd" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.731441 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a25471-c13d-434e-9f74-82de0cd19099" containerName="neutron-db-sync" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.731453 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="490b9073-0616-4d54-8383-f9eeb519c605" containerName="glance-log" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.732550 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.740916 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.741313 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.751786 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.760939 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bf9878dbc-v8nn7"] Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.775315 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bf9878dbc-v8nn7"] Mar 10 15:27:12 crc kubenswrapper[4743]: E0310 15:27:12.868842 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-manila-api:current-podified" Mar 10 15:27:12 crc kubenswrapper[4743]: E0310 15:27:12.869017 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manila-db-sync,Image:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,Command:[/bin/bash],Args:[-c sleep 0 && /usr/bin/manila-manage --config-dir /etc/manila/manila.conf.d db 
sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:job-config-data,ReadOnly:true,MountPath:/etc/manila/manila.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rpq2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42429,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42429,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-db-sync-lgtjg_openstack(45477f7e-f216-40fb-acdb-d7a1dbadba99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.873950 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-config-data\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.874194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-logs\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.874312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.874406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.874502 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-scripts\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.874610 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.874699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-ceph\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.875721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.875953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77s72\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-kube-api-access-77s72\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: E0310 15:27:12.876453 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/manila-db-sync-lgtjg" podUID="45477f7e-f216-40fb-acdb-d7a1dbadba99" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.930346 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.937121 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-logs\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-config-data\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996467 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-scripts\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996524 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996560 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-ceph\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996597 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77s72\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-kube-api-access-77s72\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.996761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-logs\") pod \"glance-default-external-api-0\" 
(UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.997152 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 15:27:12 crc kubenswrapper[4743]: I0310 15:27:12.997667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.002270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-ceph\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.003097 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.006198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-config-data\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" 
Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.012011 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-scripts\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.031513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.035891 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77s72\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-kube-api-access-77s72\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.059111 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.097985 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-config\") pod \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098069 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5gplx\" (UniqueName: \"kubernetes.io/projected/fab83d2d-0f12-4be4-9b5e-41594df332ed-kube-api-access-5gplx\") pod \"fab83d2d-0f12-4be4-9b5e-41594df332ed\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab83d2d-0f12-4be4-9b5e-41594df332ed-logs\") pod \"fab83d2d-0f12-4be4-9b5e-41594df332ed\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098149 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-config-data\") pod \"fab83d2d-0f12-4be4-9b5e-41594df332ed\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098166 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-swift-storage-0\") pod \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098182 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-svc\") pod \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098209 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fab83d2d-0f12-4be4-9b5e-41594df332ed-horizon-secret-key\") pod \"fab83d2d-0f12-4be4-9b5e-41594df332ed\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " Mar 10 15:27:13 crc 
kubenswrapper[4743]: I0310 15:27:13.098228 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-nb\") pod \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098281 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t89x\" (UniqueName: \"kubernetes.io/projected/219b9d1c-8e83-4f19-9163-ebb0ef8df490-kube-api-access-6t89x\") pod \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-sb\") pod \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\" (UID: \"219b9d1c-8e83-4f19-9163-ebb0ef8df490\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.098497 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-scripts\") pod \"fab83d2d-0f12-4be4-9b5e-41594df332ed\" (UID: \"fab83d2d-0f12-4be4-9b5e-41594df332ed\") " Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.100617 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-config-data" (OuterVolumeSpecName: "config-data") pod "fab83d2d-0f12-4be4-9b5e-41594df332ed" (UID: "fab83d2d-0f12-4be4-9b5e-41594df332ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.101035 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-scripts" (OuterVolumeSpecName: "scripts") pod "fab83d2d-0f12-4be4-9b5e-41594df332ed" (UID: "fab83d2d-0f12-4be4-9b5e-41594df332ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.101120 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab83d2d-0f12-4be4-9b5e-41594df332ed-logs" (OuterVolumeSpecName: "logs") pod "fab83d2d-0f12-4be4-9b5e-41594df332ed" (UID: "fab83d2d-0f12-4be4-9b5e-41594df332ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.103460 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab83d2d-0f12-4be4-9b5e-41594df332ed-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fab83d2d-0f12-4be4-9b5e-41594df332ed" (UID: "fab83d2d-0f12-4be4-9b5e-41594df332ed"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.106980 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab83d2d-0f12-4be4-9b5e-41594df332ed-kube-api-access-5gplx" (OuterVolumeSpecName: "kube-api-access-5gplx") pod "fab83d2d-0f12-4be4-9b5e-41594df332ed" (UID: "fab83d2d-0f12-4be4-9b5e-41594df332ed"). InnerVolumeSpecName "kube-api-access-5gplx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.109341 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219b9d1c-8e83-4f19-9163-ebb0ef8df490-kube-api-access-6t89x" (OuterVolumeSpecName: "kube-api-access-6t89x") pod "219b9d1c-8e83-4f19-9163-ebb0ef8df490" (UID: "219b9d1c-8e83-4f19-9163-ebb0ef8df490"). InnerVolumeSpecName "kube-api-access-6t89x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.171695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "219b9d1c-8e83-4f19-9163-ebb0ef8df490" (UID: "219b9d1c-8e83-4f19-9163-ebb0ef8df490"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.176358 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "219b9d1c-8e83-4f19-9163-ebb0ef8df490" (UID: "219b9d1c-8e83-4f19-9163-ebb0ef8df490"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.176893 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-config" (OuterVolumeSpecName: "config") pod "219b9d1c-8e83-4f19-9163-ebb0ef8df490" (UID: "219b9d1c-8e83-4f19-9163-ebb0ef8df490"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.180977 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "219b9d1c-8e83-4f19-9163-ebb0ef8df490" (UID: "219b9d1c-8e83-4f19-9163-ebb0ef8df490"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.189026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "219b9d1c-8e83-4f19-9163-ebb0ef8df490" (UID: "219b9d1c-8e83-4f19-9163-ebb0ef8df490"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201544 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201587 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201603 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gplx\" (UniqueName: \"kubernetes.io/projected/fab83d2d-0f12-4be4-9b5e-41594df332ed-kube-api-access-5gplx\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201615 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab83d2d-0f12-4be4-9b5e-41594df332ed-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: 
I0310 15:27:13.201628 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fab83d2d-0f12-4be4-9b5e-41594df332ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201642 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201654 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201664 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fab83d2d-0f12-4be4-9b5e-41594df332ed-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201675 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201689 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t89x\" (UniqueName: \"kubernetes.io/projected/219b9d1c-8e83-4f19-9163-ebb0ef8df490-kube-api-access-6t89x\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.201700 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219b9d1c-8e83-4f19-9163-ebb0ef8df490-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.358357 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.360191 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" event={"ID":"219b9d1c-8e83-4f19-9163-ebb0ef8df490","Type":"ContainerDied","Data":"14f8d54ad686e3d155d897471f2a8d5a879196a663a8b9c789ea102c7dc37e8f"} Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.360242 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.362288 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57cf99654f-fqjnm" Mar 10 15:27:13 crc kubenswrapper[4743]: E0310 15:27:13.369166 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-manila-api:current-podified\\\"\"" pod="openstack/manila-db-sync-lgtjg" podUID="45477f7e-f216-40fb-acdb-d7a1dbadba99" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.369462 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57cf99654f-fqjnm" event={"ID":"fab83d2d-0f12-4be4-9b5e-41594df332ed","Type":"ContainerDied","Data":"e1b5343b4dca3ceb1633a32273b87618f7240374e1988adb7722fab897b95a81"} Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.486548 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57cf99654f-fqjnm"] Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.508208 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57cf99654f-fqjnm"] Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.524111 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-dmqvp"] Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 
15:27:13.545158 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-dmqvp"] Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.590794 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-jzmg2"] Mar 10 15:27:13 crc kubenswrapper[4743]: E0310 15:27:13.591269 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="init" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.591287 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="init" Mar 10 15:27:13 crc kubenswrapper[4743]: E0310 15:27:13.591315 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="dnsmasq-dns" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.591322 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="dnsmasq-dns" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.591501 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="dnsmasq-dns" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.592491 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.613665 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-jzmg2"] Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.725111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-config\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.725210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.725243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-svc\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.725270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.725390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.725465 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcsx4\" (UniqueName: \"kubernetes.io/projected/1d51c7ac-111d-46e8-903f-01f29e4221ac-kube-api-access-mcsx4\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.800614 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75f4f5966d-fg8q8"] Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.810649 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.828491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.828706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-svc\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.834272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.834580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.834878 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcsx4\" (UniqueName: \"kubernetes.io/projected/1d51c7ac-111d-46e8-903f-01f29e4221ac-kube-api-access-mcsx4\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.835050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-config\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.836033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-config\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.830504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-svc\") pod 
\"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.836806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.830751 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.837404 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.837687 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2m64d" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.831238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.831418 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.845545 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 
15:27:13.846188 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75f4f5966d-fg8q8"] Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.896503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcsx4\" (UniqueName: \"kubernetes.io/projected/1d51c7ac-111d-46e8-903f-01f29e4221ac-kube-api-access-mcsx4\") pod \"dnsmasq-dns-6b7b667979-jzmg2\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.919001 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.936371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-ovndb-tls-certs\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.936412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dgl8\" (UniqueName: \"kubernetes.io/projected/ea014130-54ae-451e-a870-aacb43d98f25-kube-api-access-5dgl8\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.936489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-config\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.936518 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-combined-ca-bundle\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.936587 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-httpd-config\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.941148 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" path="/var/lib/kubelet/pods/219b9d1c-8e83-4f19-9163-ebb0ef8df490/volumes" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.942244 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490b9073-0616-4d54-8383-f9eeb519c605" path="/var/lib/kubelet/pods/490b9073-0616-4d54-8383-f9eeb519c605/volumes" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.950305 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="744d5501-5ab3-4086-acc8-9f7a01ca8513" path="/var/lib/kubelet/pods/744d5501-5ab3-4086-acc8-9f7a01ca8513/volumes" Mar 10 15:27:13 crc kubenswrapper[4743]: I0310 15:27:13.951004 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab83d2d-0f12-4be4-9b5e-41594df332ed" path="/var/lib/kubelet/pods/fab83d2d-0f12-4be4-9b5e-41594df332ed/volumes" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.039084 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-httpd-config\") pod \"neutron-75f4f5966d-fg8q8\" (UID: 
\"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.039388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-ovndb-tls-certs\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.039422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dgl8\" (UniqueName: \"kubernetes.io/projected/ea014130-54ae-451e-a870-aacb43d98f25-kube-api-access-5dgl8\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.039530 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-config\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.039575 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-combined-ca-bundle\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.052002 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-ovndb-tls-certs\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc 
kubenswrapper[4743]: I0310 15:27:14.054530 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-httpd-config\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.063580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-combined-ca-bundle\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.080547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-config\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.096044 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-dmqvp" podUID="219b9d1c-8e83-4f19-9163-ebb0ef8df490" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: i/o timeout" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.096630 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dgl8\" (UniqueName: \"kubernetes.io/projected/ea014130-54ae-451e-a870-aacb43d98f25-kube-api-access-5dgl8\") pod \"neutron-75f4f5966d-fg8q8\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: I0310 15:27:14.134516 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:14 crc kubenswrapper[4743]: E0310 15:27:14.963617 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 10 15:27:14 crc kubenswrapper[4743]: E0310 15:27:14.963790 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tl
s-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dd5rg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-86f6w_openstack(68a99e3b-6d76-485c-b284-5f275ba9bbef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:27:14 crc kubenswrapper[4743]: E0310 15:27:14.964968 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-86f6w" podUID="68a99e3b-6d76-485c-b284-5f275ba9bbef" Mar 10 15:27:15 crc kubenswrapper[4743]: I0310 15:27:15.305279 4743 scope.go:117] "RemoveContainer" containerID="625b14be5b44ebeb352f02f37ff237f071c8ed12f0b87f1e6c062da83d05743e" Mar 10 15:27:15 crc kubenswrapper[4743]: E0310 15:27:15.489752 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-86f6w" podUID="68a99e3b-6d76-485c-b284-5f275ba9bbef" Mar 10 15:27:15 crc kubenswrapper[4743]: I0310 
15:27:15.614047 4743 scope.go:117] "RemoveContainer" containerID="4950466aabda7b273bc607f3f199f00342d0a413584aca09688ce8ad8c76fa59" Mar 10 15:27:15 crc kubenswrapper[4743]: I0310 15:27:15.701052 4743 scope.go:117] "RemoveContainer" containerID="5b8243bc446a56930e8c070f7aeb7f03280904b8fca0ad33289ec0971c5b3d99" Mar 10 15:27:15 crc kubenswrapper[4743]: I0310 15:27:15.788444 4743 scope.go:117] "RemoveContainer" containerID="c0bb05b0860d33025afaa2609668f2d6dbd274f9a1c8c8e0727fd082c1e893d4" Mar 10 15:27:15 crc kubenswrapper[4743]: I0310 15:27:15.865449 4743 scope.go:117] "RemoveContainer" containerID="982511c686a9bee8ccd48256adb2e7a9032ce7e3d71f305e642c126cc1bacb74" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.071913 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-958fd895b-mxn2t"] Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.072206 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5855c85b77-4c45c"] Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.073991 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5855c85b77-4c45c"] Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.074074 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.101487 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.103059 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.158885 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7954db6464-ns5cf"] Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.240669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-httpd-config\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.240733 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmkbv\" (UniqueName: \"kubernetes.io/projected/94f36e37-4d73-48c9-a64b-810a95ed7bac-kube-api-access-qmkbv\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.240779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-combined-ca-bundle\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.240840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-internal-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.240872 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-config\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.240911 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-ovndb-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.240961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-public-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.332536 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8fnbh"] Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.342194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-public-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.342293 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-httpd-config\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.342318 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmkbv\" (UniqueName: \"kubernetes.io/projected/94f36e37-4d73-48c9-a64b-810a95ed7bac-kube-api-access-qmkbv\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.342365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-combined-ca-bundle\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.342402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-internal-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.342422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-config\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.342452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-ovndb-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.353587 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-ovndb-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.355896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-internal-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.357093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-combined-ca-bundle\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.360048 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-httpd-config\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.361657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-config\") pod \"neutron-5855c85b77-4c45c\" (UID: 
\"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.368550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-public-tls-certs\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.368699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmkbv\" (UniqueName: \"kubernetes.io/projected/94f36e37-4d73-48c9-a64b-810a95ed7bac-kube-api-access-qmkbv\") pod \"neutron-5855c85b77-4c45c\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") " pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.430053 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.480284 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-jzmg2"] Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.497049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c58bbcd67-dxpcc" event={"ID":"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b","Type":"ContainerStarted","Data":"3690d14ef39973676eb199ac0c743ebaacea85a3090a411e71de0ba1a770ca3e"} Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.572239 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-958fd895b-mxn2t" event={"ID":"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7","Type":"ContainerStarted","Data":"3662866b8fc6aebf1b1da766855e1936f17ac8a769bbdaba5374c2a8845c6775"} Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.583641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"482b3103-f6d6-410f-9106-b10ad1695c78","Type":"ContainerStarted","Data":"3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9"} Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.596157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lc42v" event={"ID":"8ef9806a-40c1-468d-92d8-70e92819f27b","Type":"ContainerStarted","Data":"38fa0812a7aade8e46c137973d40837bc5097536eaa054526cfa660d01b7eb39"} Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.598077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7954db6464-ns5cf" event={"ID":"c0001988-feba-4afe-9068-071af12a6fd7","Type":"ContainerStarted","Data":"5d57ec9d60f3343ab1f3ba5e638d1d3474bbd96f2b84d503fe676d5e894917e6"} Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.613861 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8fnbh" event={"ID":"57d812f1-305a-40e9-b8ff-51b5e640ca57","Type":"ContainerStarted","Data":"87b05d0a915b8d7697527c49a0ff228fd20a54c033066d1c067aa833d2e738fb"} Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.627444 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-lc42v" podStartSLOduration=3.646639747 podStartE2EDuration="35.627424389s" podCreationTimestamp="2026-03-10 15:26:41 +0000 UTC" firstStartedPulling="2026-03-10 15:26:43.617212881 +0000 UTC m=+1268.324027619" lastFinishedPulling="2026-03-10 15:27:15.597997513 +0000 UTC m=+1300.304812261" observedRunningTime="2026-03-10 15:27:16.623114246 +0000 UTC m=+1301.329928994" watchObservedRunningTime="2026-03-10 15:27:16.627424389 +0000 UTC m=+1301.334239137" Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.663124 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:16 crc kubenswrapper[4743]: I0310 15:27:16.845556 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.180972 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75f4f5966d-fg8q8"] Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.337601 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5855c85b77-4c45c"] Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.659872 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5855c85b77-4c45c" event={"ID":"94f36e37-4d73-48c9-a64b-810a95ed7bac","Type":"ContainerStarted","Data":"7717c70c1c67fdcf176cad5dc51ce08a1c89056a8d83a8e6bbfc1a1f445c980d"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.663362 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerID="81f3e686873c718856a3843947499b0170d2c149422e608f2e65d5bc0e28f3b8" exitCode=0 Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.663432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" event={"ID":"1d51c7ac-111d-46e8-903f-01f29e4221ac","Type":"ContainerDied","Data":"81f3e686873c718856a3843947499b0170d2c149422e608f2e65d5bc0e28f3b8"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.663473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" event={"ID":"1d51c7ac-111d-46e8-903f-01f29e4221ac","Type":"ContainerStarted","Data":"09c37282c4b1299bfe0ff3c4a9b8f7346ea19cf3073fb3c36c9712c02b64ee2d"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.665001 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f4f5966d-fg8q8" event={"ID":"ea014130-54ae-451e-a870-aacb43d98f25","Type":"ContainerStarted","Data":"c2848a8283f1c8452c92e1cbd25219a8af27aead1be8d49904aeba086de09651"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.671629 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"a07af3cb-3345-42d9-86c1-38bfbc27a259","Type":"ContainerStarted","Data":"a3fb04a542e234f93c972f918f7d5a7eab27a87b8e1025651754c0e57e8b7cb0"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.676648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7954db6464-ns5cf" event={"ID":"c0001988-feba-4afe-9068-071af12a6fd7","Type":"ContainerStarted","Data":"1a4a17b7c6b5e58ecc54899710b1a069b044db12fac0070f22bb4b2126726a01"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.678410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-958fd895b-mxn2t" event={"ID":"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7","Type":"ContainerStarted","Data":"fc356e0b5f59a9858baaa77354891643a1657e283afba3367188097329228099"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.680359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8fnbh" event={"ID":"57d812f1-305a-40e9-b8ff-51b5e640ca57","Type":"ContainerStarted","Data":"fe10c62081917d54d95e30624b900758b372fc23fe1e0af34141218b0944e109"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.693793 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eedca3dd-44be-41bf-b7e5-7ac48e4e7264","Type":"ContainerStarted","Data":"14f383569976f9c3b33ee7512ba1bba581411bd6590de6876304b5a23454e84a"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.709807 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8fnbh" podStartSLOduration=17.709787823 podStartE2EDuration="17.709787823s" podCreationTimestamp="2026-03-10 15:27:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:17.705295025 +0000 UTC m=+1302.412109773" watchObservedRunningTime="2026-03-10 15:27:17.709787823 +0000 UTC 
m=+1302.416602581" Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.732249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c58bbcd67-dxpcc" event={"ID":"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b","Type":"ContainerStarted","Data":"68119a62955bb4be3b995ec2294bc6a9da4607d114cadc47e5a26edfc51ddbb3"} Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.732400 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c58bbcd67-dxpcc" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerName="horizon-log" containerID="cri-o://3690d14ef39973676eb199ac0c743ebaacea85a3090a411e71de0ba1a770ca3e" gracePeriod=30 Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.733040 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c58bbcd67-dxpcc" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerName="horizon" containerID="cri-o://68119a62955bb4be3b995ec2294bc6a9da4607d114cadc47e5a26edfc51ddbb3" gracePeriod=30 Mar 10 15:27:17 crc kubenswrapper[4743]: I0310 15:27:17.774188 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c58bbcd67-dxpcc" podStartSLOduration=7.059087439 podStartE2EDuration="33.774170296s" podCreationTimestamp="2026-03-10 15:26:44 +0000 UTC" firstStartedPulling="2026-03-10 15:26:46.176955198 +0000 UTC m=+1270.883769946" lastFinishedPulling="2026-03-10 15:27:12.892038055 +0000 UTC m=+1297.598852803" observedRunningTime="2026-03-10 15:27:17.76441918 +0000 UTC m=+1302.471233928" watchObservedRunningTime="2026-03-10 15:27:17.774170296 +0000 UTC m=+1302.480985044" Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.744593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-958fd895b-mxn2t" event={"ID":"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7","Type":"ContainerStarted","Data":"c8e35c99a898ab5b18b13aae107bf529b1f8d349a368587867349ca074eddf3a"} Mar 10 15:27:18 crc 
kubenswrapper[4743]: I0310 15:27:18.746792 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5855c85b77-4c45c" event={"ID":"94f36e37-4d73-48c9-a64b-810a95ed7bac","Type":"ContainerStarted","Data":"38cb9d98a25a82ad345c96bfd981bf7e7b4553c04e203f99a2a717bb6314dd7d"} Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.747022 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5855c85b77-4c45c" Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.760410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eedca3dd-44be-41bf-b7e5-7ac48e4e7264","Type":"ContainerStarted","Data":"105ce5d166ea3a22c561a690e615f2485bb91dc82e438a69315e11efdcb92a8e"} Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.782837 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" event={"ID":"1d51c7ac-111d-46e8-903f-01f29e4221ac","Type":"ContainerStarted","Data":"6776dd2441f62d1ab751bcf7bc9237313b93fa8dd5929cf0e5f7477e3ae97d04"} Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.783103 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.797205 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-958fd895b-mxn2t" podStartSLOduration=28.79718233 podStartE2EDuration="28.79718233s" podCreationTimestamp="2026-03-10 15:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:18.768404915 +0000 UTC m=+1303.475219663" watchObservedRunningTime="2026-03-10 15:27:18.79718233 +0000 UTC m=+1303.503997068" Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.802437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f4f5966d-fg8q8" 
event={"ID":"ea014130-54ae-451e-a870-aacb43d98f25","Type":"ContainerStarted","Data":"192e509108abac65efb1d060ec5da09ac6bf2c544df24eb04d4592ef166a835a"} Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.802530 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.802545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f4f5966d-fg8q8" event={"ID":"ea014130-54ae-451e-a870-aacb43d98f25","Type":"ContainerStarted","Data":"d9aa7fa9d05f03db36a66470ebb238e22cabd8e8e11aaace35086635c13ed054"} Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.805430 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5855c85b77-4c45c" podStartSLOduration=3.805413423 podStartE2EDuration="3.805413423s" podCreationTimestamp="2026-03-10 15:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:18.786403125 +0000 UTC m=+1303.493217873" watchObservedRunningTime="2026-03-10 15:27:18.805413423 +0000 UTC m=+1303.512228171" Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.809938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07af3cb-3345-42d9-86c1-38bfbc27a259","Type":"ContainerStarted","Data":"1e5e910037c86b0214ac4e00c30ffb1b59927eb92cb3557b80803ce966bea0bc"} Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.821511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7954db6464-ns5cf" event={"ID":"c0001988-feba-4afe-9068-071af12a6fd7","Type":"ContainerStarted","Data":"8219c4b41eb7545ea397447365b146eeb777554480ab78e1282ead6e8b54a642"} Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.828838 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" podStartSLOduration=5.828820316 podStartE2EDuration="5.828820316s" podCreationTimestamp="2026-03-10 15:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:18.813693248 +0000 UTC m=+1303.520508016" watchObservedRunningTime="2026-03-10 15:27:18.828820316 +0000 UTC m=+1303.535635064" Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.855337 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75f4f5966d-fg8q8" podStartSLOduration=5.855312326 podStartE2EDuration="5.855312326s" podCreationTimestamp="2026-03-10 15:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:18.834021143 +0000 UTC m=+1303.540835891" watchObservedRunningTime="2026-03-10 15:27:18.855312326 +0000 UTC m=+1303.562127074" Mar 10 15:27:18 crc kubenswrapper[4743]: I0310 15:27:18.870055 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7954db6464-ns5cf" podStartSLOduration=28.870038144 podStartE2EDuration="28.870038144s" podCreationTimestamp="2026-03-10 15:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:18.867362328 +0000 UTC m=+1303.574177096" watchObservedRunningTime="2026-03-10 15:27:18.870038144 +0000 UTC m=+1303.576852892" Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.834232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z94q2" event={"ID":"d7ff1eff-8355-40a9-b02d-cfb47e08bb46","Type":"ContainerStarted","Data":"75244dab8ce6de7ad77790db8fd0420d4fc9fa48f1073a66c7f739ab78cfa955"} Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.837432 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="8ef9806a-40c1-468d-92d8-70e92819f27b" containerID="38fa0812a7aade8e46c137973d40837bc5097536eaa054526cfa660d01b7eb39" exitCode=0 Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.837481 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lc42v" event={"ID":"8ef9806a-40c1-468d-92d8-70e92819f27b","Type":"ContainerDied","Data":"38fa0812a7aade8e46c137973d40837bc5097536eaa054526cfa660d01b7eb39"} Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.839950 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eedca3dd-44be-41bf-b7e5-7ac48e4e7264","Type":"ContainerStarted","Data":"98b1e11f0be782cc4b452f74cc1e3cca3a597669f37c1f0010806d1bfb98284d"} Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.843257 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07af3cb-3345-42d9-86c1-38bfbc27a259","Type":"ContainerStarted","Data":"04feccecae990e1118c080ef411816b601adfa63b5b3339b6fd74c230ad7d27e"} Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.853599 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-z94q2" podStartSLOduration=4.059911062 podStartE2EDuration="38.85357742s" podCreationTimestamp="2026-03-10 15:26:41 +0000 UTC" firstStartedPulling="2026-03-10 15:26:43.657292106 +0000 UTC m=+1268.364106854" lastFinishedPulling="2026-03-10 15:27:18.450958464 +0000 UTC m=+1303.157773212" observedRunningTime="2026-03-10 15:27:19.849682609 +0000 UTC m=+1304.556497357" watchObservedRunningTime="2026-03-10 15:27:19.85357742 +0000 UTC m=+1304.560392168" Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.856990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5855c85b77-4c45c" 
event={"ID":"94f36e37-4d73-48c9-a64b-810a95ed7bac","Type":"ContainerStarted","Data":"0a0d1f7f3393d7f31400370f146d4299b8f97dde9d368b1123819a2d7c31d935"} Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.908869 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.908845725 podStartE2EDuration="7.908845725s" podCreationTimestamp="2026-03-10 15:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:19.885604787 +0000 UTC m=+1304.592419525" watchObservedRunningTime="2026-03-10 15:27:19.908845725 +0000 UTC m=+1304.615660473" Mar 10 15:27:19 crc kubenswrapper[4743]: I0310 15:27:19.971187 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.9711603 podStartE2EDuration="14.9711603s" podCreationTimestamp="2026-03-10 15:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:19.96197508 +0000 UTC m=+1304.668789838" watchObservedRunningTime="2026-03-10 15:27:19.9711603 +0000 UTC m=+1304.677975048" Mar 10 15:27:20 crc kubenswrapper[4743]: I0310 15:27:20.748156 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:27:20 crc kubenswrapper[4743]: I0310 15:27:20.748519 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:27:20 crc kubenswrapper[4743]: I0310 15:27:20.830226 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:27:20 crc kubenswrapper[4743]: I0310 15:27:20.830280 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 
15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.401344 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lc42v" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.554927 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef9806a-40c1-468d-92d8-70e92819f27b-logs\") pod \"8ef9806a-40c1-468d-92d8-70e92819f27b\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.555355 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4zxg\" (UniqueName: \"kubernetes.io/projected/8ef9806a-40c1-468d-92d8-70e92819f27b-kube-api-access-w4zxg\") pod \"8ef9806a-40c1-468d-92d8-70e92819f27b\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.555401 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef9806a-40c1-468d-92d8-70e92819f27b-logs" (OuterVolumeSpecName: "logs") pod "8ef9806a-40c1-468d-92d8-70e92819f27b" (UID: "8ef9806a-40c1-468d-92d8-70e92819f27b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.555422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-combined-ca-bundle\") pod \"8ef9806a-40c1-468d-92d8-70e92819f27b\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.555466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-scripts\") pod \"8ef9806a-40c1-468d-92d8-70e92819f27b\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.555554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-config-data\") pod \"8ef9806a-40c1-468d-92d8-70e92819f27b\" (UID: \"8ef9806a-40c1-468d-92d8-70e92819f27b\") " Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.556139 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef9806a-40c1-468d-92d8-70e92819f27b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.565077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef9806a-40c1-468d-92d8-70e92819f27b-kube-api-access-w4zxg" (OuterVolumeSpecName: "kube-api-access-w4zxg") pod "8ef9806a-40c1-468d-92d8-70e92819f27b" (UID: "8ef9806a-40c1-468d-92d8-70e92819f27b"). InnerVolumeSpecName "kube-api-access-w4zxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.573474 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-scripts" (OuterVolumeSpecName: "scripts") pod "8ef9806a-40c1-468d-92d8-70e92819f27b" (UID: "8ef9806a-40c1-468d-92d8-70e92819f27b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.588925 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ef9806a-40c1-468d-92d8-70e92819f27b" (UID: "8ef9806a-40c1-468d-92d8-70e92819f27b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.622927 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-config-data" (OuterVolumeSpecName: "config-data") pod "8ef9806a-40c1-468d-92d8-70e92819f27b" (UID: "8ef9806a-40c1-468d-92d8-70e92819f27b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.660305 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4zxg\" (UniqueName: \"kubernetes.io/projected/8ef9806a-40c1-468d-92d8-70e92819f27b-kube-api-access-w4zxg\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.660348 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.660361 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.660372 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef9806a-40c1-468d-92d8-70e92819f27b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.880005 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lc42v" event={"ID":"8ef9806a-40c1-468d-92d8-70e92819f27b","Type":"ContainerDied","Data":"d4e95baf3e35022aed87eb953f753ec5ccaf4742b01d87c1240b04094a621235"} Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.880046 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e95baf3e35022aed87eb953f753ec5ccaf4742b01d87c1240b04094a621235" Mar 10 15:27:21 crc kubenswrapper[4743]: I0310 15:27:21.880073 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-lc42v" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.050574 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68c9f99d4d-r9tj5"] Mar 10 15:27:22 crc kubenswrapper[4743]: E0310 15:27:22.051009 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef9806a-40c1-468d-92d8-70e92819f27b" containerName="placement-db-sync" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.051027 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef9806a-40c1-468d-92d8-70e92819f27b" containerName="placement-db-sync" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.051239 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef9806a-40c1-468d-92d8-70e92819f27b" containerName="placement-db-sync" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.052227 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.056662 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.057208 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.057357 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.057496 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.057664 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-79znn" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.071490 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-68c9f99d4d-r9tj5"] Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.171085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-public-tls-certs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.171175 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b97e49-cad0-4482-ba00-13ba656ac6cb-logs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.171407 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-combined-ca-bundle\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.171787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-scripts\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.171863 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-internal-tls-certs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " 
pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.171927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x9cm\" (UniqueName: \"kubernetes.io/projected/48b97e49-cad0-4482-ba00-13ba656ac6cb-kube-api-access-7x9cm\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.172070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-config-data\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.273633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-internal-tls-certs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.273690 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x9cm\" (UniqueName: \"kubernetes.io/projected/48b97e49-cad0-4482-ba00-13ba656ac6cb-kube-api-access-7x9cm\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.273738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-config-data\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " 
pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.273783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-public-tls-certs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.273837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b97e49-cad0-4482-ba00-13ba656ac6cb-logs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.273886 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-combined-ca-bundle\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.274455 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b97e49-cad0-4482-ba00-13ba656ac6cb-logs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.274665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-scripts\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.282147 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-scripts\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.282277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-internal-tls-certs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.283904 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-combined-ca-bundle\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.298432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-config-data\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.326484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-public-tls-certs\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.331090 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x9cm\" (UniqueName: 
\"kubernetes.io/projected/48b97e49-cad0-4482-ba00-13ba656ac6cb-kube-api-access-7x9cm\") pod \"placement-68c9f99d4d-r9tj5\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") " pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.383372 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.898642 4743 generic.go:334] "Generic (PLEG): container finished" podID="57d812f1-305a-40e9-b8ff-51b5e640ca57" containerID="fe10c62081917d54d95e30624b900758b372fc23fe1e0af34141218b0944e109" exitCode=0 Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.898987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8fnbh" event={"ID":"57d812f1-305a-40e9-b8ff-51b5e640ca57","Type":"ContainerDied","Data":"fe10c62081917d54d95e30624b900758b372fc23fe1e0af34141218b0944e109"} Mar 10 15:27:22 crc kubenswrapper[4743]: I0310 15:27:22.903159 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68c9f99d4d-r9tj5"] Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.358700 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.358758 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.407183 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.412777 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.920564 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="d7ff1eff-8355-40a9-b02d-cfb47e08bb46" containerID="75244dab8ce6de7ad77790db8fd0420d4fc9fa48f1073a66c7f739ab78cfa955" exitCode=0 Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.955800 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z94q2" event={"ID":"d7ff1eff-8355-40a9-b02d-cfb47e08bb46","Type":"ContainerDied","Data":"75244dab8ce6de7ad77790db8fd0420d4fc9fa48f1073a66c7f739ab78cfa955"} Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.955891 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.955907 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 15:27:23 crc kubenswrapper[4743]: I0310 15:27:23.955948 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:24 crc kubenswrapper[4743]: I0310 15:27:24.173776 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-z7zlr"] Mar 10 15:27:24 crc kubenswrapper[4743]: I0310 15:27:24.178234 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" podUID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerName="dnsmasq-dns" containerID="cri-o://2cc3a28c5a1bba28c6a1fcb245a3a9a5bc8469f4463eda608baaa10ade1d0448" gracePeriod=10 Mar 10 15:27:24 crc kubenswrapper[4743]: I0310 15:27:24.950554 4743 generic.go:334] "Generic (PLEG): container finished" podID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerID="2cc3a28c5a1bba28c6a1fcb245a3a9a5bc8469f4463eda608baaa10ade1d0448" exitCode=0 Mar 10 15:27:24 crc kubenswrapper[4743]: I0310 15:27:24.950616 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" 
event={"ID":"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43","Type":"ContainerDied","Data":"2cc3a28c5a1bba28c6a1fcb245a3a9a5bc8469f4463eda608baaa10ade1d0448"} Mar 10 15:27:25 crc kubenswrapper[4743]: I0310 15:27:25.384677 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:27:25 crc kubenswrapper[4743]: I0310 15:27:25.676114 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:25 crc kubenswrapper[4743]: I0310 15:27:25.676184 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:25 crc kubenswrapper[4743]: I0310 15:27:25.725779 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:25 crc kubenswrapper[4743]: I0310 15:27:25.737044 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:25 crc kubenswrapper[4743]: I0310 15:27:25.976356 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:27:25 crc kubenswrapper[4743]: I0310 15:27:25.977434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:25 crc kubenswrapper[4743]: I0310 15:27:25.977464 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:28 crc kubenswrapper[4743]: I0310 15:27:28.649128 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 15:27:28 crc kubenswrapper[4743]: I0310 15:27:28.649593 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:27:28 crc kubenswrapper[4743]: I0310 15:27:28.876586 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 15:27:29 crc kubenswrapper[4743]: I0310 15:27:29.032645 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:29 crc kubenswrapper[4743]: I0310 15:27:29.032791 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:27:29 crc kubenswrapper[4743]: I0310 15:27:29.038592 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.419471 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.443613 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.451685 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-z94q2" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.502505 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-credential-keys\") pod \"57d812f1-305a-40e9-b8ff-51b5e640ca57\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.502563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl556\" (UniqueName: \"kubernetes.io/projected/57d812f1-305a-40e9-b8ff-51b5e640ca57-kube-api-access-sl556\") pod \"57d812f1-305a-40e9-b8ff-51b5e640ca57\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.503796 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-scripts\") pod \"57d812f1-305a-40e9-b8ff-51b5e640ca57\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.503948 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-config-data\") pod \"57d812f1-305a-40e9-b8ff-51b5e640ca57\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.503982 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-combined-ca-bundle\") pod \"57d812f1-305a-40e9-b8ff-51b5e640ca57\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.504043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-fernet-keys\") pod \"57d812f1-305a-40e9-b8ff-51b5e640ca57\" (UID: \"57d812f1-305a-40e9-b8ff-51b5e640ca57\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.513586 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "57d812f1-305a-40e9-b8ff-51b5e640ca57" (UID: "57d812f1-305a-40e9-b8ff-51b5e640ca57"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.513745 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-scripts" (OuterVolumeSpecName: "scripts") pod "57d812f1-305a-40e9-b8ff-51b5e640ca57" (UID: "57d812f1-305a-40e9-b8ff-51b5e640ca57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.515434 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "57d812f1-305a-40e9-b8ff-51b5e640ca57" (UID: "57d812f1-305a-40e9-b8ff-51b5e640ca57"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.517072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d812f1-305a-40e9-b8ff-51b5e640ca57-kube-api-access-sl556" (OuterVolumeSpecName: "kube-api-access-sl556") pod "57d812f1-305a-40e9-b8ff-51b5e640ca57" (UID: "57d812f1-305a-40e9-b8ff-51b5e640ca57"). InnerVolumeSpecName "kube-api-access-sl556". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.607591 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-db-sync-config-data\") pod \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.611415 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sm6w\" (UniqueName: \"kubernetes.io/projected/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-kube-api-access-7sm6w\") pod \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.617386 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-combined-ca-bundle\") pod \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.617546 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-nb\") pod \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.617677 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-config\") pod \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.617791 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-svc\") pod \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.617890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-swift-storage-0\") pod \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.618020 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-kube-api-access-jbppw\") pod \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\" (UID: \"d7ff1eff-8355-40a9-b02d-cfb47e08bb46\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.618118 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-sb\") pod \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\" (UID: \"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43\") " Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.619647 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.619751 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.619883 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl556\" (UniqueName: 
\"kubernetes.io/projected/57d812f1-305a-40e9-b8ff-51b5e640ca57-kube-api-access-sl556\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.619948 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.629495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d7ff1eff-8355-40a9-b02d-cfb47e08bb46" (UID: "d7ff1eff-8355-40a9-b02d-cfb47e08bb46"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.629988 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-kube-api-access-7sm6w" (OuterVolumeSpecName: "kube-api-access-7sm6w") pod "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" (UID: "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43"). InnerVolumeSpecName "kube-api-access-7sm6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.682292 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-kube-api-access-jbppw" (OuterVolumeSpecName: "kube-api-access-jbppw") pod "d7ff1eff-8355-40a9-b02d-cfb47e08bb46" (UID: "d7ff1eff-8355-40a9-b02d-cfb47e08bb46"). InnerVolumeSpecName "kube-api-access-jbppw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.722022 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-kube-api-access-jbppw\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.722127 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sm6w\" (UniqueName: \"kubernetes.io/projected/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-kube-api-access-7sm6w\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.722187 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.752839 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.833521 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-958fd895b-mxn2t" podUID="cccf05c8-d4e8-4a1d-912f-5f4a37440ac7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.848742 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-config-data" (OuterVolumeSpecName: "config-data") pod "57d812f1-305a-40e9-b8ff-51b5e640ca57" (UID: 
"57d812f1-305a-40e9-b8ff-51b5e640ca57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.895977 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57d812f1-305a-40e9-b8ff-51b5e640ca57" (UID: "57d812f1-305a-40e9-b8ff-51b5e640ca57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.927596 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.927656 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d812f1-305a-40e9-b8ff-51b5e640ca57-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:30 crc kubenswrapper[4743]: I0310 15:27:30.952147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7ff1eff-8355-40a9-b02d-cfb47e08bb46" (UID: "d7ff1eff-8355-40a9-b02d-cfb47e08bb46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.017278 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" (UID: "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.029510 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.029557 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff1eff-8355-40a9-b02d-cfb47e08bb46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.040775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z94q2" event={"ID":"d7ff1eff-8355-40a9-b02d-cfb47e08bb46","Type":"ContainerDied","Data":"fa72c4f309f4ed7d545948aacbd01ccc4753a9bcd62a5003c96855587feb5697"} Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.040837 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa72c4f309f4ed7d545948aacbd01ccc4753a9bcd62a5003c96855587feb5697" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.040926 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-z94q2" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.044228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68c9f99d4d-r9tj5" event={"ID":"48b97e49-cad0-4482-ba00-13ba656ac6cb","Type":"ContainerStarted","Data":"1ab354af79cb9e6d33738f0ab8137c4fdbed5a4de1c27dffc2ee61ff22fe1b48"} Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.044292 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68c9f99d4d-r9tj5" event={"ID":"48b97e49-cad0-4482-ba00-13ba656ac6cb","Type":"ContainerStarted","Data":"120de600b08187c22afbc1b21e4e6de575b76f4f4941ef47cdbf10068a81d7cd"} Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.046372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" event={"ID":"ad60ac10-f6f3-46b3-93df-fbb0fa1dca43","Type":"ContainerDied","Data":"0219e3f6ea9e5f7f3544453d460290940faac1e7503420ed2e8f725cc1527a6a"} Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.046416 4743 scope.go:117] "RemoveContainer" containerID="2cc3a28c5a1bba28c6a1fcb245a3a9a5bc8469f4463eda608baaa10ade1d0448" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.046451 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.054242 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" (UID: "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.054621 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8fnbh" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.054740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8fnbh" event={"ID":"57d812f1-305a-40e9-b8ff-51b5e640ca57","Type":"ContainerDied","Data":"87b05d0a915b8d7697527c49a0ff228fd20a54c033066d1c067aa833d2e738fb"} Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.054891 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87b05d0a915b8d7697527c49a0ff228fd20a54c033066d1c067aa833d2e738fb" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.057330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"482b3103-f6d6-410f-9106-b10ad1695c78","Type":"ContainerStarted","Data":"84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5"} Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.086085 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" (UID: "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.087847 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-config" (OuterVolumeSpecName: "config") pod "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" (UID: "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.088412 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" (UID: "ad60ac10-f6f3-46b3-93df-fbb0fa1dca43"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.103031 4743 scope.go:117] "RemoveContainer" containerID="69cdfd9df798610b35a6af430930d201b5ce3a778dcd0954f947237039177ee5" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.131015 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.131049 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.131060 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.131069 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.397055 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-z7zlr"] Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.425559 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-56df8fb6b7-z7zlr"] Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.644885 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5bbb89db44-8df8j"] Mar 10 15:27:31 crc kubenswrapper[4743]: E0310 15:27:31.645863 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerName="init" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.645881 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerName="init" Mar 10 15:27:31 crc kubenswrapper[4743]: E0310 15:27:31.645898 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerName="dnsmasq-dns" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.645921 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerName="dnsmasq-dns" Mar 10 15:27:31 crc kubenswrapper[4743]: E0310 15:27:31.645936 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ff1eff-8355-40a9-b02d-cfb47e08bb46" containerName="barbican-db-sync" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.645943 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ff1eff-8355-40a9-b02d-cfb47e08bb46" containerName="barbican-db-sync" Mar 10 15:27:31 crc kubenswrapper[4743]: E0310 15:27:31.645973 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d812f1-305a-40e9-b8ff-51b5e640ca57" containerName="keystone-bootstrap" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.645979 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d812f1-305a-40e9-b8ff-51b5e640ca57" containerName="keystone-bootstrap" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.646157 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d812f1-305a-40e9-b8ff-51b5e640ca57" containerName="keystone-bootstrap" Mar 10 15:27:31 crc 
kubenswrapper[4743]: I0310 15:27:31.646182 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ff1eff-8355-40a9-b02d-cfb47e08bb46" containerName="barbican-db-sync" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.646199 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerName="dnsmasq-dns" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.646927 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.650373 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.650544 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.650619 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.650746 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wwgbc" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.651031 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.651316 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.688832 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bbb89db44-8df8j"] Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.753552 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-internal-tls-certs\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.753672 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-fernet-keys\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.753725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-public-tls-certs\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.753783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-credential-keys\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.753848 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-combined-ca-bundle\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.753934 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-scripts\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.753990 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z62bj\" (UniqueName: \"kubernetes.io/projected/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-kube-api-access-z62bj\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.754028 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-config-data\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.828189 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6fd79b999-7w7qq"] Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.829886 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.845798 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6c7d566dd6-n2r2w"] Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.848258 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.849595 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.850119 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.850251 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w2ql2" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.851255 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.858020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6fd79b999-7w7qq"] Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.859133 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z62bj\" (UniqueName: \"kubernetes.io/projected/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-kube-api-access-z62bj\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.859184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-config-data\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.859236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-internal-tls-certs\") pod 
\"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.859284 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-fernet-keys\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.859305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-public-tls-certs\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.859335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-credential-keys\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.859362 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-combined-ca-bundle\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.859399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-scripts\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " 
pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.872072 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c7d566dd6-n2r2w"] Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.877010 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-public-tls-certs\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.877594 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-fernet-keys\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.880208 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-credential-keys\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.880643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-scripts\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.881185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-combined-ca-bundle\") pod \"keystone-5bbb89db44-8df8j\" (UID: 
\"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.889700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-internal-tls-certs\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.901641 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-config-data\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.940614 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z62bj\" (UniqueName: \"kubernetes.io/projected/c440f8ea-e5b3-49ea-a981-cf68bac5a2e5-kube-api-access-z62bj\") pod \"keystone-5bbb89db44-8df8j\" (UID: \"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5\") " pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.961533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cff04bc-a0b8-4155-8407-4f8253faa9e3-logs\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.961932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-combined-ca-bundle\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: 
\"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.962044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data-custom\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.962213 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-combined-ca-bundle\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.962307 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data-custom\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.962431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sww8\" (UniqueName: \"kubernetes.io/projected/82ee9c60-c790-40a1-816a-4152f87c16e0-kube-api-access-2sww8\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.962566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/82ee9c60-c790-40a1-816a-4152f87c16e0-logs\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.962677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8bf\" (UniqueName: \"kubernetes.io/projected/0cff04bc-a0b8-4155-8407-4f8253faa9e3-kube-api-access-5m8bf\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.962770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.962953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.976513 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:31 crc kubenswrapper[4743]: I0310 15:27:31.995707 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" path="/var/lib/kubelet/pods/ad60ac10-f6f3-46b3-93df-fbb0fa1dca43/volumes" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:31.998353 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vmpwv"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.005713 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.026199 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vmpwv"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.026742 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-z7zlr" podUID="ad60ac10-f6f3-46b3-93df-fbb0fa1dca43" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: i/o timeout" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cff04bc-a0b8-4155-8407-4f8253faa9e3-logs\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqvmm\" (UniqueName: \"kubernetes.io/projected/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-kube-api-access-gqvmm\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: 
I0310 15:27:32.066356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-combined-ca-bundle\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data-custom\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-combined-ca-bundle\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " 
pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data-custom\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sww8\" (UniqueName: \"kubernetes.io/projected/82ee9c60-c790-40a1-816a-4152f87c16e0-kube-api-access-2sww8\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066663 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82ee9c60-c790-40a1-816a-4152f87c16e0-logs\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " 
pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066692 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-config\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8bf\" (UniqueName: \"kubernetes.io/projected/0cff04bc-a0b8-4155-8407-4f8253faa9e3-kube-api-access-5m8bf\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.066759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.072351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data-custom\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: 
\"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.072969 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cff04bc-a0b8-4155-8407-4f8253faa9e3-logs\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.077299 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f79684784-rgzrv"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.078534 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82ee9c60-c790-40a1-816a-4152f87c16e0-logs\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.079040 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.085848 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.087422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data-custom\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.088418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-combined-ca-bundle\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.098033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.166585 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-combined-ca-bundle\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.166956 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.167136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m8bf\" (UniqueName: \"kubernetes.io/projected/0cff04bc-a0b8-4155-8407-4f8253faa9e3-kube-api-access-5m8bf\") pod \"barbican-keystone-listener-6c7d566dd6-n2r2w\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.167205 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f79684784-rgzrv"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.171230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sww8\" (UniqueName: \"kubernetes.io/projected/82ee9c60-c790-40a1-816a-4152f87c16e0-kube-api-access-2sww8\") pod \"barbican-worker-6fd79b999-7w7qq\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.178450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqvmm\" (UniqueName: \"kubernetes.io/projected/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-kube-api-access-gqvmm\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.189639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.189770 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.190064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.190178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.190363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-config\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.191622 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc 
kubenswrapper[4743]: I0310 15:27:32.199362 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-config\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.200651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68c9f99d4d-r9tj5" event={"ID":"48b97e49-cad0-4482-ba00-13ba656ac6cb","Type":"ContainerStarted","Data":"ab8c8a1d1914ee17944093b828600d4e6fa46ebe86810d67bef6c56595fd288d"} Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.209484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.214582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.216631 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.220414 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.220650 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.241919 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7dbc4488d6-xzd4j"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.243181 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.232096 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqvmm\" (UniqueName: \"kubernetes.io/projected/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-kube-api-access-gqvmm\") pod \"dnsmasq-dns-848cf88cfc-vmpwv\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.249938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lgtjg" event={"ID":"45477f7e-f216-40fb-acdb-d7a1dbadba99","Type":"ContainerStarted","Data":"7469fa0f036b039a82ebd805a4b2cf98465a614c7a0da55c947089bbf1fc3e9b"} Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.250205 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.259306 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.272123 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-86d87df6b7-xvf5q"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.297981 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.327609 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qfq7\" (UniqueName: \"kubernetes.io/projected/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-kube-api-access-2qfq7\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.327700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-logs\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.327976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-combined-ca-bundle\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.328010 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data-custom\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.328119 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data\") pod 
\"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.378074 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435209 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-config-data-custom\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435264 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-combined-ca-bundle\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-config-data\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435353 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qfq7\" (UniqueName: \"kubernetes.io/projected/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-kube-api-access-2qfq7\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 
15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435393 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-logs\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435494 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-combined-ca-bundle\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435601 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e55638-f4f6-4dcb-944d-671e657f664e-logs\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435621 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-config-data\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qqkf\" (UniqueName: \"kubernetes.io/projected/05d63a20-7909-4ff1-8416-f55b448633fb-kube-api-access-7qqkf\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: 
\"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435680 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-combined-ca-bundle\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data-custom\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-config-data-custom\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435825 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05d63a20-7909-4ff1-8416-f55b448633fb-logs\") pod 
\"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.435933 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxq6w\" (UniqueName: \"kubernetes.io/projected/15e55638-f4f6-4dcb-944d-671e657f664e-kube-api-access-xxq6w\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.436208 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-logs\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.452846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data-custom\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.454708 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7dbc4488d6-xzd4j"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.456698 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-combined-ca-bundle\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.457197 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.462532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qfq7\" (UniqueName: \"kubernetes.io/projected/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-kube-api-access-2qfq7\") pod \"barbican-api-6f79684784-rgzrv\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") " pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.467553 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-86d87df6b7-xvf5q"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.472100 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-68c9f99d4d-r9tj5" podStartSLOduration=10.472087791 podStartE2EDuration="10.472087791s" podCreationTimestamp="2026-03-10 15:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:32.267799225 +0000 UTC m=+1316.974613993" watchObservedRunningTime="2026-03-10 15:27:32.472087791 +0000 UTC m=+1317.178902539" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.491745 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-lgtjg" podStartSLOduration=5.427154423 podStartE2EDuration="52.491723877s" podCreationTimestamp="2026-03-10 15:26:40 +0000 UTC" firstStartedPulling="2026-03-10 15:26:43.17545417 +0000 UTC m=+1267.882268908" lastFinishedPulling="2026-03-10 15:27:30.240023614 +0000 UTC m=+1314.946838362" observedRunningTime="2026-03-10 15:27:32.302672943 +0000 UTC m=+1317.009487681" 
watchObservedRunningTime="2026-03-10 15:27:32.491723877 +0000 UTC m=+1317.198538615" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.509882 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b7b59b66d-df78r"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.511579 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.537982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxq6w\" (UniqueName: \"kubernetes.io/projected/15e55638-f4f6-4dcb-944d-671e657f664e-kube-api-access-xxq6w\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538027 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-config-data-custom\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-combined-ca-bundle\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-config-data\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: 
\"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538133 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-combined-ca-bundle\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538175 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e55638-f4f6-4dcb-944d-671e657f664e-logs\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-config-data\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538215 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qqkf\" (UniqueName: \"kubernetes.io/projected/05d63a20-7909-4ff1-8416-f55b448633fb-kube-api-access-7qqkf\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538249 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-config-data-custom\") pod 
\"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538283 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05d63a20-7909-4ff1-8416-f55b448633fb-logs\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.538640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05d63a20-7909-4ff1-8416-f55b448633fb-logs\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.539571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e55638-f4f6-4dcb-944d-671e657f664e-logs\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.542399 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-config-data-custom\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.545104 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b7b59b66d-df78r"] Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.551774 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-config-data\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.552699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-config-data-custom\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.555147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-combined-ca-bundle\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.559908 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e55638-f4f6-4dcb-944d-671e657f664e-config-data\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.560428 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d63a20-7909-4ff1-8416-f55b448633fb-combined-ca-bundle\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.570091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-7qqkf\" (UniqueName: \"kubernetes.io/projected/05d63a20-7909-4ff1-8416-f55b448633fb-kube-api-access-7qqkf\") pod \"barbican-worker-86d87df6b7-xvf5q\" (UID: \"05d63a20-7909-4ff1-8416-f55b448633fb\") " pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.572438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxq6w\" (UniqueName: \"kubernetes.io/projected/15e55638-f4f6-4dcb-944d-671e657f664e-kube-api-access-xxq6w\") pod \"barbican-keystone-listener-7dbc4488d6-xzd4j\" (UID: \"15e55638-f4f6-4dcb-944d-671e657f664e\") " pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.592616 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f79684784-rgzrv" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.642603 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data-custom\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.642671 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.642771 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ntwf\" (UniqueName: \"kubernetes.io/projected/7f942785-954d-4441-ac29-69e7b65ead94-kube-api-access-9ntwf\") pod 
\"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.642795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f942785-954d-4441-ac29-69e7b65ead94-logs\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.643067 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-combined-ca-bundle\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.721346 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.753755 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ntwf\" (UniqueName: \"kubernetes.io/projected/7f942785-954d-4441-ac29-69e7b65ead94-kube-api-access-9ntwf\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.753853 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f942785-954d-4441-ac29-69e7b65ead94-logs\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.754042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-combined-ca-bundle\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.754187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data-custom\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.754233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " 
pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.754865 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f942785-954d-4441-ac29-69e7b65ead94-logs\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.768080 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-86d87df6b7-xvf5q" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.776541 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ntwf\" (UniqueName: \"kubernetes.io/projected/7f942785-954d-4441-ac29-69e7b65ead94-kube-api-access-9ntwf\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.778185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.788803 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-combined-ca-bundle\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.791947 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data-custom\") pod \"barbican-api-6b7b59b66d-df78r\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:32 crc kubenswrapper[4743]: I0310 15:27:32.837521 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.363217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-86f6w" event={"ID":"68a99e3b-6d76-485c-b284-5f275ba9bbef","Type":"ContainerStarted","Data":"8ffbb79c1ba7553c23c09ecc5886e0f9abbe9773800fc7878bbf8fcde3f4c2de"} Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.384903 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c7d566dd6-n2r2w"] Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.433992 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vmpwv"] Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.470891 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bbb89db44-8df8j"] Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.481774 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-86f6w" podStartSLOduration=4.871045843 podStartE2EDuration="53.481749737s" podCreationTimestamp="2026-03-10 15:26:40 +0000 UTC" firstStartedPulling="2026-03-10 15:26:42.788250773 +0000 UTC m=+1267.495065521" lastFinishedPulling="2026-03-10 15:27:31.398954667 +0000 UTC m=+1316.105769415" observedRunningTime="2026-03-10 15:27:33.414096781 +0000 UTC m=+1318.120911529" watchObservedRunningTime="2026-03-10 15:27:33.481749737 +0000 UTC m=+1318.188564495" Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.659336 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6fd79b999-7w7qq"] 
Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.687648 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-86d87df6b7-xvf5q"] Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.744311 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f79684784-rgzrv"] Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.752976 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7dbc4488d6-xzd4j"] Mar 10 15:27:33 crc kubenswrapper[4743]: W0310 15:27:33.760849 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ee9c60_c790_40a1_816a_4152f87c16e0.slice/crio-1699856c3ee2daa565d2439c06b44178988b10e25e4b421a21a662364641fada WatchSource:0}: Error finding container 1699856c3ee2daa565d2439c06b44178988b10e25e4b421a21a662364641fada: Status 404 returned error can't find the container with id 1699856c3ee2daa565d2439c06b44178988b10e25e4b421a21a662364641fada Mar 10 15:27:33 crc kubenswrapper[4743]: I0310 15:27:33.951350 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b7b59b66d-df78r"] Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.386748 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fd79b999-7w7qq" event={"ID":"82ee9c60-c790-40a1-816a-4152f87c16e0","Type":"ContainerStarted","Data":"1699856c3ee2daa565d2439c06b44178988b10e25e4b421a21a662364641fada"} Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.403694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" event={"ID":"0cff04bc-a0b8-4155-8407-4f8253faa9e3","Type":"ContainerStarted","Data":"a2523c50d60a3e00a86ffee03ac7e3ab470e861d5fe67bca9f602be42708e8df"} Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.411889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-5bbb89db44-8df8j" event={"ID":"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5","Type":"ContainerStarted","Data":"f8ca204add69e39bdd8f6a1fa914b2791bf02928dab9024e1a661442efd2378f"} Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.411966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bbb89db44-8df8j" event={"ID":"c440f8ea-e5b3-49ea-a981-cf68bac5a2e5","Type":"ContainerStarted","Data":"52edd401617a4a0609e2e87d7d11eb2976fda95447de96ee798495e0682c3805"} Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.412358 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.414340 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f79684784-rgzrv" event={"ID":"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9","Type":"ContainerStarted","Data":"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac"} Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.414373 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f79684784-rgzrv" event={"ID":"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9","Type":"ContainerStarted","Data":"adf4a2d964ccc56ef771f6302a0f67e471f6a714962f11a3f9d70a8aa5217518"} Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.416686 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86d87df6b7-xvf5q" event={"ID":"05d63a20-7909-4ff1-8416-f55b448633fb","Type":"ContainerStarted","Data":"5825ccf0ae0e2b2fed61c1cce2e418f77f7a778fb8bbfb9eeb3d5fa28c9125c6"} Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.418263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" event={"ID":"15e55638-f4f6-4dcb-944d-671e657f664e","Type":"ContainerStarted","Data":"0786dda52d3d63e3bed547ea27ad7b307fc03f33a83c1b0462f5582929f83334"} Mar 10 15:27:34 crc 
kubenswrapper[4743]: I0310 15:27:34.428699 4743 generic.go:334] "Generic (PLEG): container finished" podID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" containerID="026b45c8e11d839a6f4981e884df8bf077362712339e5f1f160cbd9c38fba3e2" exitCode=0
Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.429408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" event={"ID":"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6","Type":"ContainerDied","Data":"026b45c8e11d839a6f4981e884df8bf077362712339e5f1f160cbd9c38fba3e2"}
Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.429487 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" event={"ID":"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6","Type":"ContainerStarted","Data":"6c0644b1b9d0286f00e3cbd70e7dcaffd55fa590bdb16cd75ac58d627bd6fa6f"}
Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.440510 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b59b66d-df78r" event={"ID":"7f942785-954d-4441-ac29-69e7b65ead94","Type":"ContainerStarted","Data":"4d7f02810547df8b25aca23259a587c4cf1b119718eb9e12f206d49a9ff93e02"}
Mar 10 15:27:34 crc kubenswrapper[4743]: I0310 15:27:34.557996 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5bbb89db44-8df8j" podStartSLOduration=3.557968618 podStartE2EDuration="3.557968618s" podCreationTimestamp="2026-03-10 15:27:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:34.547517082 +0000 UTC m=+1319.254331830" watchObservedRunningTime="2026-03-10 15:27:34.557968618 +0000 UTC m=+1319.264783366"
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.508643 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" event={"ID":"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6","Type":"ContainerStarted","Data":"e0d52e7ff40eba6b44a945eb6d4ac6091b5525650d674c73187ec2dab5c70c36"}
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.510321 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv"
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.524774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b59b66d-df78r" event={"ID":"7f942785-954d-4441-ac29-69e7b65ead94","Type":"ContainerStarted","Data":"d6fdde5ec4adfd5b915d588ff90fa45e205b51316b675665145cfb4b6acbf4b0"}
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.524849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b59b66d-df78r" event={"ID":"7f942785-954d-4441-ac29-69e7b65ead94","Type":"ContainerStarted","Data":"a44b0ab62e898d54d43645d8c85eb8aaa2e9faaa6c8a1ebe1d196220ca7d6e80"}
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.525258 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b7b59b66d-df78r"
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.525313 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b7b59b66d-df78r"
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.546286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f79684784-rgzrv" event={"ID":"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9","Type":"ContainerStarted","Data":"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0"}
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.546338 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f79684784-rgzrv"
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.546361 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f79684784-rgzrv"
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.550537 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" podStartSLOduration=4.550511278 podStartE2EDuration="4.550511278s" podCreationTimestamp="2026-03-10 15:27:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:35.535357949 +0000 UTC m=+1320.242172717" watchObservedRunningTime="2026-03-10 15:27:35.550511278 +0000 UTC m=+1320.257326036"
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.564942 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b7b59b66d-df78r" podStartSLOduration=3.564924716 podStartE2EDuration="3.564924716s" podCreationTimestamp="2026-03-10 15:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:35.563256949 +0000 UTC m=+1320.270071687" watchObservedRunningTime="2026-03-10 15:27:35.564924716 +0000 UTC m=+1320.271739464"
Mar 10 15:27:35 crc kubenswrapper[4743]: I0310 15:27:35.593293 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f79684784-rgzrv" podStartSLOduration=3.593276259 podStartE2EDuration="3.593276259s" podCreationTimestamp="2026-03-10 15:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:35.593109585 +0000 UTC m=+1320.299924343" watchObservedRunningTime="2026-03-10 15:27:35.593276259 +0000 UTC m=+1320.300091007"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.207511 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f79684784-rgzrv"]
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.255865 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-85ffb9c4dd-pf4mm"]
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.257509 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.266149 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.266372 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.273171 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85ffb9c4dd-pf4mm"]
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.328765 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-combined-ca-bundle\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.328853 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-config-data-custom\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.328892 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-logs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.328933 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-config-data\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.328993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ngk6\" (UniqueName: \"kubernetes.io/projected/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-kube-api-access-9ngk6\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.329025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-public-tls-certs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.329087 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-internal-tls-certs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.430670 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-config-data-custom\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.430757 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-logs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.430803 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-config-data\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.430939 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ngk6\" (UniqueName: \"kubernetes.io/projected/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-kube-api-access-9ngk6\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.430977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-public-tls-certs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.431038 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-internal-tls-certs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.431114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-combined-ca-bundle\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.453862 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-logs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.453967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-config-data-custom\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.465602 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-combined-ca-bundle\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.465874 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ngk6\" (UniqueName: \"kubernetes.io/projected/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-kube-api-access-9ngk6\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.477328 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-config-data\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.479739 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-internal-tls-certs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.482868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98f89f9-bf8d-4a56-868d-dba8bf4c56e7-public-tls-certs\") pod \"barbican-api-85ffb9c4dd-pf4mm\" (UID: \"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7\") " pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:36 crc kubenswrapper[4743]: I0310 15:27:36.606547 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:37 crc kubenswrapper[4743]: I0310 15:27:37.595982 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f79684784-rgzrv" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerName="barbican-api-log" containerID="cri-o://4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac" gracePeriod=30
Mar 10 15:27:37 crc kubenswrapper[4743]: I0310 15:27:37.596021 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f79684784-rgzrv" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerName="barbican-api" containerID="cri-o://a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0" gracePeriod=30
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.311957 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85ffb9c4dd-pf4mm"]
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.481777 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f79684784-rgzrv"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.584717 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qfq7\" (UniqueName: \"kubernetes.io/projected/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-kube-api-access-2qfq7\") pod \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") "
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.584839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data-custom\") pod \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") "
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.584877 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-combined-ca-bundle\") pod \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") "
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.584916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data\") pod \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") "
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.585007 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-logs\") pod \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\" (UID: \"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9\") "
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.585847 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-logs" (OuterVolumeSpecName: "logs") pod "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" (UID: "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.595663 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" (UID: "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.598000 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-kube-api-access-2qfq7" (OuterVolumeSpecName: "kube-api-access-2qfq7") pod "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" (UID: "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9"). InnerVolumeSpecName "kube-api-access-2qfq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.622803 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85ffb9c4dd-pf4mm" event={"ID":"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7","Type":"ContainerStarted","Data":"4727dac38236d820a3e726a38da4a5168441e7d51e755967a95c932b02c0871e"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.646120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fd79b999-7w7qq" event={"ID":"82ee9c60-c790-40a1-816a-4152f87c16e0","Type":"ContainerStarted","Data":"8b872ebecc6a2a29bc444153e8f498acea78073dee93b1d2f2eeb719777ad36b"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.646471 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fd79b999-7w7qq" event={"ID":"82ee9c60-c790-40a1-816a-4152f87c16e0","Type":"ContainerStarted","Data":"2a095a8acf8c8cc66bb6e09146ffbd5e9facaf71496efad0a082b57c3af62541"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.666230 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" event={"ID":"0cff04bc-a0b8-4155-8407-4f8253faa9e3","Type":"ContainerStarted","Data":"ce60b74ef61fbf223c1bc21dab3d5d62cd5f5056838b8d940cf9e51fcdd94d94"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.666291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" event={"ID":"0cff04bc-a0b8-4155-8407-4f8253faa9e3","Type":"ContainerStarted","Data":"2eec530f630fcd1bb36338a94574510858c4a559aa4bb1813f6b72cc91a043a3"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.669015 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6fd79b999-7w7qq" podStartSLOduration=3.670432563 podStartE2EDuration="7.66899477s" podCreationTimestamp="2026-03-10 15:27:31 +0000 UTC" firstStartedPulling="2026-03-10 15:27:33.791031077 +0000 UTC m=+1318.497845825" lastFinishedPulling="2026-03-10 15:27:37.789593284 +0000 UTC m=+1322.496408032" observedRunningTime="2026-03-10 15:27:38.668318761 +0000 UTC m=+1323.375133529" watchObservedRunningTime="2026-03-10 15:27:38.66899477 +0000 UTC m=+1323.375809518"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.675181 4743 generic.go:334] "Generic (PLEG): container finished" podID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerID="a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0" exitCode=0
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.675251 4743 generic.go:334] "Generic (PLEG): container finished" podID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerID="4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac" exitCode=143
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.675482 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f79684784-rgzrv"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.676307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f79684784-rgzrv" event={"ID":"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9","Type":"ContainerDied","Data":"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.676362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f79684784-rgzrv" event={"ID":"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9","Type":"ContainerDied","Data":"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.676372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f79684784-rgzrv" event={"ID":"4e9bcd21-9034-46e7-a0ac-b2da56fa37f9","Type":"ContainerDied","Data":"adf4a2d964ccc56ef771f6302a0f67e471f6a714962f11a3f9d70a8aa5217518"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.676393 4743 scope.go:117] "RemoveContainer" containerID="a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.688106 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qfq7\" (UniqueName: \"kubernetes.io/projected/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-kube-api-access-2qfq7\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.688146 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.688159 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-logs\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.695279 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86d87df6b7-xvf5q" event={"ID":"05d63a20-7909-4ff1-8416-f55b448633fb","Type":"ContainerStarted","Data":"b2bb973844d3c390e15ec46d8f534b772e3b79062caa5f2d8b0b33183961da8e"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.722497 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" podStartSLOduration=3.37272847 podStartE2EDuration="7.722473544s" podCreationTimestamp="2026-03-10 15:27:31 +0000 UTC" firstStartedPulling="2026-03-10 15:27:33.440028885 +0000 UTC m=+1318.146843633" lastFinishedPulling="2026-03-10 15:27:37.789773959 +0000 UTC m=+1322.496588707" observedRunningTime="2026-03-10 15:27:38.688369988 +0000 UTC m=+1323.395184736" watchObservedRunningTime="2026-03-10 15:27:38.722473544 +0000 UTC m=+1323.429288292"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.726114 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" event={"ID":"15e55638-f4f6-4dcb-944d-671e657f664e","Type":"ContainerStarted","Data":"6af153237f5a02b9c63d164f98a69cc44803a922dafc8a9cab59dc3cfc19a684"}
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.726925 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" (UID: "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.739279 4743 scope.go:117] "RemoveContainer" containerID="4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.789998 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.794789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data" (OuterVolumeSpecName: "config-data") pod "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" (UID: "4e9bcd21-9034-46e7-a0ac-b2da56fa37f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.893279 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.894271 4743 scope.go:117] "RemoveContainer" containerID="a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0"
Mar 10 15:27:38 crc kubenswrapper[4743]: E0310 15:27:38.896212 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0\": container with ID starting with a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0 not found: ID does not exist" containerID="a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.896267 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0"} err="failed to get container status \"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0\": rpc error: code = NotFound desc = could not find container \"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0\": container with ID starting with a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0 not found: ID does not exist"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.896296 4743 scope.go:117] "RemoveContainer" containerID="4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac"
Mar 10 15:27:38 crc kubenswrapper[4743]: E0310 15:27:38.897648 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac\": container with ID starting with 4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac not found: ID does not exist" containerID="4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.897710 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac"} err="failed to get container status \"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac\": rpc error: code = NotFound desc = could not find container \"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac\": container with ID starting with 4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac not found: ID does not exist"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.897738 4743 scope.go:117] "RemoveContainer" containerID="a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.898257 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0"} err="failed to get container status \"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0\": rpc error: code = NotFound desc = could not find container \"a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0\": container with ID starting with a015b54293ecb00ec354c0a3c4fccde5bfd63112383c5a0007562b092afda1c0 not found: ID does not exist"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.898300 4743 scope.go:117] "RemoveContainer" containerID="4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac"
Mar 10 15:27:38 crc kubenswrapper[4743]: I0310 15:27:38.898601 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac"} err="failed to get container status \"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac\": rpc error: code = NotFound desc = could not find container \"4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac\": container with ID starting with 4484e17cbaf4ac8af12192fe13d6621906a699f32c27a965ceaa6cc408f386ac not found: ID does not exist"
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.074782 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f79684784-rgzrv"]
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.086676 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f79684784-rgzrv"]
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.740011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86d87df6b7-xvf5q" event={"ID":"05d63a20-7909-4ff1-8416-f55b448633fb","Type":"ContainerStarted","Data":"5be8262f81e3e88d51e846412ceb1d1ee604502bc7029e41b57fa066fe476194"}
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.746777 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" event={"ID":"15e55638-f4f6-4dcb-944d-671e657f664e","Type":"ContainerStarted","Data":"f4f8f6a24f4885c9df8945d8879c0fd5e28f22b70add4df42217986ce4ad49fd"}
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.750666 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85ffb9c4dd-pf4mm" event={"ID":"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7","Type":"ContainerStarted","Data":"59bc04b9b2267ba56312f91575aaa3976ff9843104a9a14e39f87cb93c10b085"}
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.750715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85ffb9c4dd-pf4mm" event={"ID":"a98f89f9-bf8d-4a56-868d-dba8bf4c56e7","Type":"ContainerStarted","Data":"4a4eeddf010eaf0c50438ecb6606331f80c8dc1e50884b1b00af583e93733fcd"}
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.804514 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-86d87df6b7-xvf5q" podStartSLOduration=3.724957019 podStartE2EDuration="7.80449485s" podCreationTimestamp="2026-03-10 15:27:32 +0000 UTC" firstStartedPulling="2026-03-10 15:27:33.710459765 +0000 UTC m=+1318.417274523" lastFinishedPulling="2026-03-10 15:27:37.789997606 +0000 UTC m=+1322.496812354" observedRunningTime="2026-03-10 15:27:39.765934088 +0000 UTC m=+1324.472748836" watchObservedRunningTime="2026-03-10 15:27:39.80449485 +0000 UTC m=+1324.511309598"
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.808793 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6fd79b999-7w7qq"]
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.818830 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-85ffb9c4dd-pf4mm" podStartSLOduration=3.818795955 podStartE2EDuration="3.818795955s" podCreationTimestamp="2026-03-10 15:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:39.786187841 +0000 UTC m=+1324.493002589" watchObservedRunningTime="2026-03-10 15:27:39.818795955 +0000 UTC m=+1324.525610703"
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.846502 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7dbc4488d6-xzd4j" podStartSLOduration=3.826212567 podStartE2EDuration="7.846475389s" podCreationTimestamp="2026-03-10 15:27:32 +0000 UTC" firstStartedPulling="2026-03-10 15:27:33.789779461 +0000 UTC m=+1318.496594219" lastFinishedPulling="2026-03-10 15:27:37.810042303 +0000 UTC m=+1322.516857041" observedRunningTime="2026-03-10 15:27:39.809018948 +0000 UTC m=+1324.515833696" watchObservedRunningTime="2026-03-10 15:27:39.846475389 +0000 UTC m=+1324.553290127"
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.862183 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6c7d566dd6-n2r2w"]
Mar 10 15:27:39 crc kubenswrapper[4743]: I0310 15:27:39.929444 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" path="/var/lib/kubelet/pods/4e9bcd21-9034-46e7-a0ac-b2da56fa37f9/volumes"
Mar 10 15:27:40 crc kubenswrapper[4743]: I0310 15:27:40.748064 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused"
Mar 10 15:27:40 crc kubenswrapper[4743]: I0310 15:27:40.767694 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:40 crc kubenswrapper[4743]: I0310 15:27:40.767741 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85ffb9c4dd-pf4mm"
Mar 10 15:27:40 crc kubenswrapper[4743]: I0310 15:27:40.767765 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6fd79b999-7w7qq" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerName="barbican-worker" containerID="cri-o://8b872ebecc6a2a29bc444153e8f498acea78073dee93b1d2f2eeb719777ad36b" gracePeriod=30
Mar 10 15:27:40 crc kubenswrapper[4743]: I0310 15:27:40.768123 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerName="barbican-keystone-listener" containerID="cri-o://ce60b74ef61fbf223c1bc21dab3d5d62cd5f5056838b8d940cf9e51fcdd94d94" gracePeriod=30
Mar 10 15:27:40 crc kubenswrapper[4743]: I0310 15:27:40.768082 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerName="barbican-keystone-listener-log" containerID="cri-o://2eec530f630fcd1bb36338a94574510858c4a559aa4bb1813f6b72cc91a043a3" gracePeriod=30
Mar 10 15:27:40 crc kubenswrapper[4743]: I0310 15:27:40.768348 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6fd79b999-7w7qq" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerName="barbican-worker-log" containerID="cri-o://2a095a8acf8c8cc66bb6e09146ffbd5e9facaf71496efad0a082b57c3af62541" gracePeriod=30
Mar 10 15:27:40 crc kubenswrapper[4743]: I0310 15:27:40.830330 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-958fd895b-mxn2t" podUID="cccf05c8-d4e8-4a1d-912f-5f4a37440ac7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused"
Mar 10 15:27:41 crc kubenswrapper[4743]: I0310 15:27:41.841386 4743 generic.go:334] "Generic (PLEG): container finished" podID="68a99e3b-6d76-485c-b284-5f275ba9bbef" containerID="8ffbb79c1ba7553c23c09ecc5886e0f9abbe9773800fc7878bbf8fcde3f4c2de" exitCode=0
Mar 10 15:27:41 crc kubenswrapper[4743]: I0310 15:27:41.841487 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-86f6w" event={"ID":"68a99e3b-6d76-485c-b284-5f275ba9bbef","Type":"ContainerDied","Data":"8ffbb79c1ba7553c23c09ecc5886e0f9abbe9773800fc7878bbf8fcde3f4c2de"}
Mar 10 15:27:41 crc kubenswrapper[4743]: I0310 15:27:41.876885 4743 generic.go:334] "Generic (PLEG): container finished" podID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerID="2a095a8acf8c8cc66bb6e09146ffbd5e9facaf71496efad0a082b57c3af62541" exitCode=143
Mar 10 15:27:41 crc kubenswrapper[4743]: I0310 15:27:41.877094 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fd79b999-7w7qq" event={"ID":"82ee9c60-c790-40a1-816a-4152f87c16e0","Type":"ContainerDied","Data":"2a095a8acf8c8cc66bb6e09146ffbd5e9facaf71496efad0a082b57c3af62541"}
Mar 10 15:27:41 crc kubenswrapper[4743]: I0310 15:27:41.881220 4743 generic.go:334] "Generic (PLEG): container finished" podID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerID="2eec530f630fcd1bb36338a94574510858c4a559aa4bb1813f6b72cc91a043a3" exitCode=143
Mar 10 15:27:41 crc kubenswrapper[4743]: I0310 15:27:41.881865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" event={"ID":"0cff04bc-a0b8-4155-8407-4f8253faa9e3","Type":"ContainerDied","Data":"2eec530f630fcd1bb36338a94574510858c4a559aa4bb1813f6b72cc91a043a3"}
Mar 10 15:27:42 crc kubenswrapper[4743]: I0310 15:27:42.381188 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv"
Mar 10 15:27:42 crc kubenswrapper[4743]: I0310 15:27:42.483958 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-jzmg2"]
Mar 10 15:27:42 crc kubenswrapper[4743]: I0310 15:27:42.484372 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerName="dnsmasq-dns" containerID="cri-o://6776dd2441f62d1ab751bcf7bc9237313b93fa8dd5929cf0e5f7477e3ae97d04" gracePeriod=10
Mar 10 15:27:42 crc kubenswrapper[4743]: I0310 15:27:42.904687 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerID="6776dd2441f62d1ab751bcf7bc9237313b93fa8dd5929cf0e5f7477e3ae97d04" exitCode=0
Mar 10 15:27:42 crc kubenswrapper[4743]: I0310 15:27:42.905056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2"
event={"ID":"1d51c7ac-111d-46e8-903f-01f29e4221ac","Type":"ContainerDied","Data":"6776dd2441f62d1ab751bcf7bc9237313b93fa8dd5929cf0e5f7477e3ae97d04"} Mar 10 15:27:43 crc kubenswrapper[4743]: I0310 15:27:43.924377 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: connect: connection refused" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.157730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.418220 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5855c85b77-4c45c"] Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.419372 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5855c85b77-4c45c" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-httpd" containerID="cri-o://0a0d1f7f3393d7f31400370f146d4299b8f97dde9d368b1123819a2d7c31d935" gracePeriod=30 Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.420906 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5855c85b77-4c45c" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-api" containerID="cri-o://38cb9d98a25a82ad345c96bfd981bf7e7b4553c04e203f99a2a717bb6314dd7d" gracePeriod=30 Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.463119 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56f8646897-kmnvw"] Mar 10 15:27:44 crc kubenswrapper[4743]: E0310 15:27:44.463573 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerName="barbican-api-log" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.463584 4743 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerName="barbican-api-log" Mar 10 15:27:44 crc kubenswrapper[4743]: E0310 15:27:44.463596 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerName="barbican-api" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.463601 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerName="barbican-api" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.463785 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerName="barbican-api-log" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.463802 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9bcd21-9034-46e7-a0ac-b2da56fa37f9" containerName="barbican-api" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.464743 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.491695 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f8646897-kmnvw"] Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.524409 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-public-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.524470 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74njv\" (UniqueName: \"kubernetes.io/projected/7513fc87-13e7-4273-98eb-fda8dd8d0305-kube-api-access-74njv\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " 
pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.524600 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-combined-ca-bundle\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.524753 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-httpd-config\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.524856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-internal-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.524910 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-config\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.524965 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-ovndb-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " 
pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.539940 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5855c85b77-4c45c" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.163:9696/\": read tcp 10.217.0.2:48660->10.217.0.163:9696: read: connection reset by peer" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.626422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-internal-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.627259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-config\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.627314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-ovndb-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.627448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-public-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.627480 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74njv\" (UniqueName: \"kubernetes.io/projected/7513fc87-13e7-4273-98eb-fda8dd8d0305-kube-api-access-74njv\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.627557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-combined-ca-bundle\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.627579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-httpd-config\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.632568 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-internal-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.633095 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-httpd-config\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.638025 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-public-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.638735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-combined-ca-bundle\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.650184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-ovndb-tls-certs\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.652837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74njv\" (UniqueName: \"kubernetes.io/projected/7513fc87-13e7-4273-98eb-fda8dd8d0305-kube-api-access-74njv\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.664924 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7513fc87-13e7-4273-98eb-fda8dd8d0305-config\") pod \"neutron-56f8646897-kmnvw\" (UID: \"7513fc87-13e7-4273-98eb-fda8dd8d0305\") " pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.818875 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.947714 4743 generic.go:334] "Generic (PLEG): container finished" podID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerID="0a0d1f7f3393d7f31400370f146d4299b8f97dde9d368b1123819a2d7c31d935" exitCode=0 Mar 10 15:27:44 crc kubenswrapper[4743]: I0310 15:27:44.948620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5855c85b77-4c45c" event={"ID":"94f36e37-4d73-48c9-a64b-810a95ed7bac","Type":"ContainerDied","Data":"0a0d1f7f3393d7f31400370f146d4299b8f97dde9d368b1123819a2d7c31d935"} Mar 10 15:27:45 crc kubenswrapper[4743]: I0310 15:27:45.612096 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:45 crc kubenswrapper[4743]: I0310 15:27:45.856804 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:46 crc kubenswrapper[4743]: I0310 15:27:46.431274 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5855c85b77-4c45c" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.163:9696/\": dial tcp 10.217.0.163:9696: connect: connection refused" Mar 10 15:27:47 crc kubenswrapper[4743]: I0310 15:27:47.980724 4743 generic.go:334] "Generic (PLEG): container finished" podID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerID="68119a62955bb4be3b995ec2294bc6a9da4607d114cadc47e5a26edfc51ddbb3" exitCode=137 Mar 10 15:27:47 crc kubenswrapper[4743]: I0310 15:27:47.981107 4743 generic.go:334] "Generic (PLEG): container finished" podID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerID="3690d14ef39973676eb199ac0c743ebaacea85a3090a411e71de0ba1a770ca3e" exitCode=137 Mar 10 15:27:47 crc kubenswrapper[4743]: I0310 15:27:47.980775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7c58bbcd67-dxpcc" event={"ID":"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b","Type":"ContainerDied","Data":"68119a62955bb4be3b995ec2294bc6a9da4607d114cadc47e5a26edfc51ddbb3"} Mar 10 15:27:47 crc kubenswrapper[4743]: I0310 15:27:47.981158 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c58bbcd67-dxpcc" event={"ID":"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b","Type":"ContainerDied","Data":"3690d14ef39973676eb199ac0c743ebaacea85a3090a411e71de0ba1a770ca3e"} Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.277475 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85ffb9c4dd-pf4mm" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.444153 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85ffb9c4dd-pf4mm" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.645236 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-86f6w" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.648475 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b7b59b66d-df78r"] Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.659726 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b7b59b66d-df78r" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api" containerID="cri-o://d6fdde5ec4adfd5b915d588ff90fa45e205b51316b675665145cfb4b6acbf4b0" gracePeriod=30 Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.648774 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b7b59b66d-df78r" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api-log" containerID="cri-o://a44b0ab62e898d54d43645d8c85eb8aaa2e9faaa6c8a1ebe1d196220ca7d6e80" gracePeriod=30 Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.665461 
4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7b59b66d-df78r" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": EOF" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.759999 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-scripts\") pod \"68a99e3b-6d76-485c-b284-5f275ba9bbef\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.760065 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-config-data\") pod \"68a99e3b-6d76-485c-b284-5f275ba9bbef\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.760093 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-db-sync-config-data\") pod \"68a99e3b-6d76-485c-b284-5f275ba9bbef\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.760124 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-combined-ca-bundle\") pod \"68a99e3b-6d76-485c-b284-5f275ba9bbef\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.760158 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd5rg\" (UniqueName: \"kubernetes.io/projected/68a99e3b-6d76-485c-b284-5f275ba9bbef-kube-api-access-dd5rg\") pod \"68a99e3b-6d76-485c-b284-5f275ba9bbef\" (UID: 
\"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.760294 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68a99e3b-6d76-485c-b284-5f275ba9bbef-etc-machine-id\") pod \"68a99e3b-6d76-485c-b284-5f275ba9bbef\" (UID: \"68a99e3b-6d76-485c-b284-5f275ba9bbef\") " Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.760855 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68a99e3b-6d76-485c-b284-5f275ba9bbef-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "68a99e3b-6d76-485c-b284-5f275ba9bbef" (UID: "68a99e3b-6d76-485c-b284-5f275ba9bbef"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.795570 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "68a99e3b-6d76-485c-b284-5f275ba9bbef" (UID: "68a99e3b-6d76-485c-b284-5f275ba9bbef"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.796691 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a99e3b-6d76-485c-b284-5f275ba9bbef-kube-api-access-dd5rg" (OuterVolumeSpecName: "kube-api-access-dd5rg") pod "68a99e3b-6d76-485c-b284-5f275ba9bbef" (UID: "68a99e3b-6d76-485c-b284-5f275ba9bbef"). InnerVolumeSpecName "kube-api-access-dd5rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.797298 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-scripts" (OuterVolumeSpecName: "scripts") pod "68a99e3b-6d76-485c-b284-5f275ba9bbef" (UID: "68a99e3b-6d76-485c-b284-5f275ba9bbef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.860989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68a99e3b-6d76-485c-b284-5f275ba9bbef" (UID: "68a99e3b-6d76-485c-b284-5f275ba9bbef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.863027 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.863051 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.863065 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.863076 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd5rg\" (UniqueName: \"kubernetes.io/projected/68a99e3b-6d76-485c-b284-5f275ba9bbef-kube-api-access-dd5rg\") on node \"crc\" DevicePath \"\"" 
Mar 10 15:27:48 crc kubenswrapper[4743]: I0310 15:27:48.863088 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68a99e3b-6d76-485c-b284-5f275ba9bbef-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.002996 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-config-data" (OuterVolumeSpecName: "config-data") pod "68a99e3b-6d76-485c-b284-5f275ba9bbef" (UID: "68a99e3b-6d76-485c-b284-5f275ba9bbef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.075140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-86f6w" event={"ID":"68a99e3b-6d76-485c-b284-5f275ba9bbef","Type":"ContainerDied","Data":"cef2e7c147540f7b19254f5416cdd113250b631982a8a3fab85e9d38670a1a08"} Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.075182 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cef2e7c147540f7b19254f5416cdd113250b631982a8a3fab85e9d38670a1a08" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.075250 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-86f6w" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.084147 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a99e3b-6d76-485c-b284-5f275ba9bbef-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.103142 4743 generic.go:334] "Generic (PLEG): container finished" podID="7f942785-954d-4441-ac29-69e7b65ead94" containerID="a44b0ab62e898d54d43645d8c85eb8aaa2e9faaa6c8a1ebe1d196220ca7d6e80" exitCode=143 Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.104261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b59b66d-df78r" event={"ID":"7f942785-954d-4441-ac29-69e7b65ead94","Type":"ContainerDied","Data":"a44b0ab62e898d54d43645d8c85eb8aaa2e9faaa6c8a1ebe1d196220ca7d6e80"} Mar 10 15:27:49 crc kubenswrapper[4743]: E0310 15:27:49.839588 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Mar 10 15:27:49 crc kubenswrapper[4743]: E0310 15:27:49.840032 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hbmzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(482b3103-f6d6-410f-9106-b10ad1695c78): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:27:49 crc kubenswrapper[4743]: E0310 15:27:49.841487 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.957339 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:49 crc kubenswrapper[4743]: E0310 15:27:49.957851 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a99e3b-6d76-485c-b284-5f275ba9bbef" containerName="cinder-db-sync" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.957866 4743 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="68a99e3b-6d76-485c-b284-5f275ba9bbef" containerName="cinder-db-sync" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.958079 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a99e3b-6d76-485c-b284-5f275ba9bbef" containerName="cinder-db-sync" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.959068 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.960921 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.962113 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mcr6l" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.962335 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.962577 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 15:27:49 crc kubenswrapper[4743]: I0310 15:27:49.982431 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.005171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.005269 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/943d6458-faf7-4ed9-b883-51bfae20d07e-etc-machine-id\") pod \"cinder-scheduler-0\" 
(UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.005296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhf8w\" (UniqueName: \"kubernetes.io/projected/943d6458-faf7-4ed9-b883-51bfae20d07e-kube-api-access-lhf8w\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.005331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.005357 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.005433 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-scripts\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.031966 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-76whq"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.034556 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.078571 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.080537 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.090030 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.107560 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.107651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-scripts\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.107700 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.109503 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/943d6458-faf7-4ed9-b883-51bfae20d07e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " 
pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.109854 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhf8w\" (UniqueName: \"kubernetes.io/projected/943d6458-faf7-4ed9-b883-51bfae20d07e-kube-api-access-lhf8w\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.109971 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.110214 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/943d6458-faf7-4ed9-b883-51bfae20d07e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.115407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-scripts\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.115636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.124515 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-6578955fd5-76whq"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.125455 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.164449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.171002 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.172417 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.175639 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.176060 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhf8w\" (UniqueName: \"kubernetes.io/projected/943d6458-faf7-4ed9-b883-51bfae20d07e-kube-api-access-lhf8w\") pod \"cinder-scheduler-0\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: E0310 15:27:50.187955 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerName="dnsmasq-dns" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.191339 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerName="dnsmasq-dns" Mar 10 15:27:50 crc kubenswrapper[4743]: E0310 15:27:50.191429 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerName="horizon" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.191498 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerName="horizon" Mar 10 15:27:50 crc kubenswrapper[4743]: E0310 15:27:50.192016 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerName="init" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.192089 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerName="init" Mar 10 15:27:50 crc kubenswrapper[4743]: E0310 15:27:50.192163 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerName="horizon-log" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.192221 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerName="horizon-log" Mar 10 15:27:50 crc 
kubenswrapper[4743]: I0310 15:27:50.192525 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerName="horizon" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.192610 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" containerName="horizon-log" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.192690 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerName="dnsmasq-dns" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.212799 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgz56\" (UniqueName: \"kubernetes.io/projected/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-kube-api-access-vgz56\") pod \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.212889 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-scripts\") pod \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.212934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-sb\") pod \"1d51c7ac-111d-46e8-903f-01f29e4221ac\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.212955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-config-data\") pod \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " Mar 10 15:27:50 
crc kubenswrapper[4743]: I0310 15:27:50.212972 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-nb\") pod \"1d51c7ac-111d-46e8-903f-01f29e4221ac\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213136 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213156 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213195 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213255 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213279 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-sys\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmhr\" (UniqueName: 
\"kubernetes.io/projected/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-kube-api-access-jwmhr\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-dev\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213361 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-lib-modules\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213386 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213403 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-scripts\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-run\") pod \"cinder-backup-0\" (UID: 
\"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-ceph\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213501 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213574 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjwd\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-kube-api-access-prjwd\") pod \"cinder-backup-0\" (UID: 
\"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213602 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-svc\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.213622 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-config\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.216871 4743 generic.go:334] "Generic (PLEG): container finished" podID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerID="38cb9d98a25a82ad345c96bfd981bf7e7b4553c04e203f99a2a717bb6314dd7d" exitCode=0 Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.221436 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.221794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5855c85b77-4c45c" event={"ID":"94f36e37-4d73-48c9-a64b-810a95ed7bac","Type":"ContainerDied","Data":"38cb9d98a25a82ad345c96bfd981bf7e7b4553c04e203f99a2a717bb6314dd7d"} Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.222617 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.237625 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-kube-api-access-vgz56" (OuterVolumeSpecName: "kube-api-access-vgz56") pod "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" (UID: "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b"). InnerVolumeSpecName "kube-api-access-vgz56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.239202 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.268738 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-scripts" (OuterVolumeSpecName: "scripts") pod "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" (UID: "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.275525 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.302914 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.303956 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" event={"ID":"1d51c7ac-111d-46e8-903f-01f29e4221ac","Type":"ContainerDied","Data":"09c37282c4b1299bfe0ff3c4a9b8f7346ea19cf3073fb3c36c9712c02b64ee2d"} Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.303995 4743 scope.go:117] "RemoveContainer" containerID="6776dd2441f62d1ab751bcf7bc9237313b93fa8dd5929cf0e5f7477e3ae97d04" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.307354 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-config-data" (OuterVolumeSpecName: "config-data") pod "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" (UID: "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.313163 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1d51c7ac-111d-46e8-903f-01f29e4221ac" (UID: "1d51c7ac-111d-46e8-903f-01f29e4221ac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.315768 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcsx4\" (UniqueName: \"kubernetes.io/projected/1d51c7ac-111d-46e8-903f-01f29e4221ac-kube-api-access-mcsx4\") pod \"1d51c7ac-111d-46e8-903f-01f29e4221ac\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.315805 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-logs\") pod \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.315873 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-svc\") pod \"1d51c7ac-111d-46e8-903f-01f29e4221ac\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.315889 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-swift-storage-0\") pod \"1d51c7ac-111d-46e8-903f-01f29e4221ac\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.315908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-config\") pod \"1d51c7ac-111d-46e8-903f-01f29e4221ac\" (UID: \"1d51c7ac-111d-46e8-903f-01f29e4221ac\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.315973 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-horizon-secret-key\") pod \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\" (UID: \"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b\") " Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316192 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-sys\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmhr\" (UniqueName: \"kubernetes.io/projected/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-kube-api-access-jwmhr\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-dev\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316279 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-lib-modules\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc 
kubenswrapper[4743]: I0310 15:27:50.316317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-scripts\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316367 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-run\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-ceph\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316413 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316455 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjwd\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-kube-api-access-prjwd\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316502 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-svc\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-config\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-combined-ca-bundle\") pod 
\"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316656 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316678 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316696 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316754 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgz56\" (UniqueName: \"kubernetes.io/projected/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-kube-api-access-vgz56\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316765 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316775 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316783 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.316869 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.323939 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-sys\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " 
pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.324171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.324258 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-logs" (OuterVolumeSpecName: "logs") pod "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" (UID: "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.325605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.326255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.331364 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.329803 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.332008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-ceph\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.332390 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-dev\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.332426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-lib-modules\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.332616 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.336909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-machine-id\") pod \"cinder-backup-0\" (UID: 
\"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.337053 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.337081 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-run\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.337885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d51c7ac-111d-46e8-903f-01f29e4221ac-kube-api-access-mcsx4" (OuterVolumeSpecName: "kube-api-access-mcsx4") pod "1d51c7ac-111d-46e8-903f-01f29e4221ac" (UID: "1d51c7ac-111d-46e8-903f-01f29e4221ac"). InnerVolumeSpecName "kube-api-access-mcsx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.339019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d51c7ac-111d-46e8-903f-01f29e4221ac" (UID: "1d51c7ac-111d-46e8-903f-01f29e4221ac"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.341356 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" (UID: "6a5acc3b-0431-490e-b3c8-3b2ffa682f8b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.341692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-svc\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.345995 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-config\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.351110 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.353352 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.355388 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.357940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.374436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-scripts\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.374431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjwd\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-kube-api-access-prjwd\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.374648 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.375197 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.380688 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmhr\" (UniqueName: \"kubernetes.io/projected/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-kube-api-access-jwmhr\") pod \"dnsmasq-dns-6578955fd5-76whq\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.381596 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.388548 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" containerName="ceilometer-notification-agent" containerID="cri-o://3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9" gracePeriod=30 Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.389521 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c58bbcd67-dxpcc" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.391936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c58bbcd67-dxpcc" event={"ID":"6a5acc3b-0431-490e-b3c8-3b2ffa682f8b","Type":"ContainerDied","Data":"ad362fe8c86aa9064ae7e4957010462f8ca217a0f84a3acb8a2c8792e1a1a26b"} Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.392007 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" containerName="sg-core" containerID="cri-o://84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5" gracePeriod=30 Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.393005 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") " pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.420625 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " 
pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.420943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.421057 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.421150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.421272 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjrvd\" (UniqueName: \"kubernetes.io/projected/38279d60-7565-460d-a703-b6aac3615f2c-kube-api-access-tjrvd\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.421359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-run\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: 
I0310 15:27:50.421468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.421564 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.421666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.423167 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.423333 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.423502 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rxk4\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-kube-api-access-8rxk4\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.423623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38279d60-7565-460d-a703-b6aac3615f2c-logs\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.423774 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.423936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-scripts\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.425401 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1d51c7ac-111d-46e8-903f-01f29e4221ac" (UID: "1d51c7ac-111d-46e8-903f-01f29e4221ac"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.426204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.426317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.426372 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data-custom\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.426409 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.426464 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc 
kubenswrapper[4743]: I0310 15:27:50.426732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.426771 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38279d60-7565-460d-a703-b6aac3615f2c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.426793 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.432885 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.432913 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.432927 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: 
I0310 15:27:50.432945 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcsx4\" (UniqueName: \"kubernetes.io/projected/1d51c7ac-111d-46e8-903f-01f29e4221ac-kube-api-access-mcsx4\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.432956 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.445298 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d51c7ac-111d-46e8-903f-01f29e4221ac" (UID: "1d51c7ac-111d-46e8-903f-01f29e4221ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.456733 4743 scope.go:117] "RemoveContainer" containerID="81f3e686873c718856a3843947499b0170d2c149422e608f2e65d5bc0e28f3b8" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.462727 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-config" (OuterVolumeSpecName: "config") pod "1d51c7ac-111d-46e8-903f-01f29e4221ac" (UID: "1d51c7ac-111d-46e8-903f-01f29e4221ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.481897 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.507233 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536099 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjrvd\" (UniqueName: \"kubernetes.io/projected/38279d60-7565-460d-a703-b6aac3615f2c-kube-api-access-tjrvd\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536391 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-run\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536465 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536558 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536608 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rxk4\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-kube-api-access-8rxk4\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536681 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38279d60-7565-460d-a703-b6aac3615f2c-logs\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536723 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536771 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-scripts\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.537442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.536829 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.537768 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.537910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data-custom\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.538016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.538101 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.538187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.538317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38279d60-7565-460d-a703-b6aac3615f2c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.538423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.538745 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.538838 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d51c7ac-111d-46e8-903f-01f29e4221ac-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.540761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.541190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-run\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.541274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.541611 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.541665 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.541713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.541755 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.541830 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.542204 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38279d60-7565-460d-a703-b6aac3615f2c-logs\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.543141 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.543143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.543675 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38279d60-7565-460d-a703-b6aac3615f2c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.543836 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.545210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.545770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.546026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.546236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-scripts\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.550075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data-custom\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.566221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjrvd\" (UniqueName: \"kubernetes.io/projected/38279d60-7565-460d-a703-b6aac3615f2c-kube-api-access-tjrvd\") pod \"cinder-api-0\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.566935 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rxk4\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-kube-api-access-8rxk4\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.591605 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c58bbcd67-dxpcc"]
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.600281 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c58bbcd67-dxpcc"]
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.606451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.728957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.794930 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5855c85b77-4c45c"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.812327 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-jzmg2"]
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.832357 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-jzmg2"]
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.848330 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-public-tls-certs\") pod \"94f36e37-4d73-48c9-a64b-810a95ed7bac\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") "
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.848399 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-combined-ca-bundle\") pod \"94f36e37-4d73-48c9-a64b-810a95ed7bac\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") "
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.848485 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-internal-tls-certs\") pod \"94f36e37-4d73-48c9-a64b-810a95ed7bac\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") "
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.848566 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-ovndb-tls-certs\") pod \"94f36e37-4d73-48c9-a64b-810a95ed7bac\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") "
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.848623 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmkbv\" (UniqueName: \"kubernetes.io/projected/94f36e37-4d73-48c9-a64b-810a95ed7bac-kube-api-access-qmkbv\") pod \"94f36e37-4d73-48c9-a64b-810a95ed7bac\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") "
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.848735 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-config\") pod \"94f36e37-4d73-48c9-a64b-810a95ed7bac\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") "
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.848906 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-httpd-config\") pod \"94f36e37-4d73-48c9-a64b-810a95ed7bac\" (UID: \"94f36e37-4d73-48c9-a64b-810a95ed7bac\") "
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.858928 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f36e37-4d73-48c9-a64b-810a95ed7bac-kube-api-access-qmkbv" (OuterVolumeSpecName: "kube-api-access-qmkbv") pod "94f36e37-4d73-48c9-a64b-810a95ed7bac" (UID: "94f36e37-4d73-48c9-a64b-810a95ed7bac"). InnerVolumeSpecName "kube-api-access-qmkbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.859071 4743 scope.go:117] "RemoveContainer" containerID="68119a62955bb4be3b995ec2294bc6a9da4607d114cadc47e5a26edfc51ddbb3"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.863559 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f8646897-kmnvw"]
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.870218 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "94f36e37-4d73-48c9-a64b-810a95ed7bac" (UID: "94f36e37-4d73-48c9-a64b-810a95ed7bac"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.881289 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.954438 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmkbv\" (UniqueName: \"kubernetes.io/projected/94f36e37-4d73-48c9-a64b-810a95ed7bac-kube-api-access-qmkbv\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.954467 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.973133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94f36e37-4d73-48c9-a64b-810a95ed7bac" (UID: "94f36e37-4d73-48c9-a64b-810a95ed7bac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.992548 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-config" (OuterVolumeSpecName: "config") pod "94f36e37-4d73-48c9-a64b-810a95ed7bac" (UID: "94f36e37-4d73-48c9-a64b-810a95ed7bac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:50 crc kubenswrapper[4743]: I0310 15:27:50.992934 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "94f36e37-4d73-48c9-a64b-810a95ed7bac" (UID: "94f36e37-4d73-48c9-a64b-810a95ed7bac"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.031306 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "94f36e37-4d73-48c9-a64b-810a95ed7bac" (UID: "94f36e37-4d73-48c9-a64b-810a95ed7bac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.057381 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.057418 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.057431 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.057443 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.059601 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "94f36e37-4d73-48c9-a64b-810a95ed7bac" (UID: "94f36e37-4d73-48c9-a64b-810a95ed7bac"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.159974 4743 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f36e37-4d73-48c9-a64b-810a95ed7bac-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.177041 4743 scope.go:117] "RemoveContainer" containerID="3690d14ef39973676eb199ac0c743ebaacea85a3090a411e71de0ba1a770ca3e"
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.415288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5855c85b77-4c45c" event={"ID":"94f36e37-4d73-48c9-a64b-810a95ed7bac","Type":"ContainerDied","Data":"7717c70c1c67fdcf176cad5dc51ce08a1c89056a8d83a8e6bbfc1a1f445c980d"}
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.415611 4743 scope.go:117] "RemoveContainer" containerID="0a0d1f7f3393d7f31400370f146d4299b8f97dde9d368b1123819a2d7c31d935"
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.415445 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5855c85b77-4c45c"
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.432029 4743 generic.go:334] "Generic (PLEG): container finished" podID="482b3103-f6d6-410f-9106-b10ad1695c78" containerID="84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5" exitCode=2
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.432415 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"482b3103-f6d6-410f-9106-b10ad1695c78","Type":"ContainerDied","Data":"84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5"}
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.460014 4743 scope.go:117] "RemoveContainer" containerID="38cb9d98a25a82ad345c96bfd981bf7e7b4553c04e203f99a2a717bb6314dd7d"
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.469116 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5855c85b77-4c45c"]
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.471687 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f8646897-kmnvw" event={"ID":"7513fc87-13e7-4273-98eb-fda8dd8d0305","Type":"ContainerStarted","Data":"18c5a5a19c38135136733ca2cb652ef12f8aa63f4453f3a3a58ddf7a96065ba8"}
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.487712 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5855c85b77-4c45c"]
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.549389 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-76whq"]
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.568485 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.702073 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.755922 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.968725 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" path="/var/lib/kubelet/pods/1d51c7ac-111d-46e8-903f-01f29e4221ac/volumes"
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.969764 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a5acc3b-0431-490e-b3c8-3b2ffa682f8b" path="/var/lib/kubelet/pods/6a5acc3b-0431-490e-b3c8-3b2ffa682f8b/volumes"
Mar 10 15:27:51 crc kubenswrapper[4743]: I0310 15:27:51.970510 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" path="/var/lib/kubelet/pods/94f36e37-4d73-48c9-a64b-810a95ed7bac/volumes"
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.045215 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.495476 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d4c7db42-928a-4d49-95df-e2073ad24b21","Type":"ContainerStarted","Data":"ec79dee920edc940df1a7b6bfd548fcabc44ad49acdf23bd874858b580c22022"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.511031 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"943d6458-faf7-4ed9-b883-51bfae20d07e","Type":"ContainerStarted","Data":"616fd3bfb3ccc3129f81ec25307ccde2708427e2e3f9640c37f246993d1fb37c"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.527368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"38279d60-7565-460d-a703-b6aac3615f2c","Type":"ContainerStarted","Data":"555e7d56ce1ea5496f618063bd838d008834a067c6d19263ca93a1a6aad9611a"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.538804 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f8646897-kmnvw" event={"ID":"7513fc87-13e7-4273-98eb-fda8dd8d0305","Type":"ContainerStarted","Data":"5b59618d5abddfea705bfdbd758fe33ca71d09b92c4a2b7461562ddaac4dbc5c"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.538964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f8646897-kmnvw" event={"ID":"7513fc87-13e7-4273-98eb-fda8dd8d0305","Type":"ContainerStarted","Data":"33b98b5e596abb6e9e3ee7d930f60991c78feec36a1e221fa0df3b984b11fdf9"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.540919 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56f8646897-kmnvw"
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.551258 4743 generic.go:334] "Generic (PLEG): container finished" podID="45477f7e-f216-40fb-acdb-d7a1dbadba99" containerID="7469fa0f036b039a82ebd805a4b2cf98465a614c7a0da55c947089bbf1fc3e9b" exitCode=0
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.551335 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lgtjg" event={"ID":"45477f7e-f216-40fb-acdb-d7a1dbadba99","Type":"ContainerDied","Data":"7469fa0f036b039a82ebd805a4b2cf98465a614c7a0da55c947089bbf1fc3e9b"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.553779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2a12054f-0a1c-4294-8855-bcb45a1e3684","Type":"ContainerStarted","Data":"6df130ee9f012ae80f658429dbc96ea8d85430dc769ff75d2c7d52d7d68af962"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.555301 4743 generic.go:334] "Generic (PLEG): container finished" podID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" containerID="5bb36f4a4448b1c9b4f7e960684bb9b8fc1f491337262ba5647a451cad09750a" exitCode=0
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.555332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-76whq" event={"ID":"2bb57920-f14f-4e98-bea5-08e3aa17ffb1","Type":"ContainerDied","Data":"5bb36f4a4448b1c9b4f7e960684bb9b8fc1f491337262ba5647a451cad09750a"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.555363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-76whq" event={"ID":"2bb57920-f14f-4e98-bea5-08e3aa17ffb1","Type":"ContainerStarted","Data":"10f272e59b6080743d855852f51bda324a4ffc1dae7bcf3f0f89620c953962f6"}
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.598438 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56f8646897-kmnvw" podStartSLOduration=8.598422661 podStartE2EDuration="8.598422661s" podCreationTimestamp="2026-03-10 15:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:52.594179741 +0000 UTC m=+1337.300994489" watchObservedRunningTime="2026-03-10 15:27:52.598422661 +0000 UTC m=+1337.305237409"
Mar 10 15:27:52 crc kubenswrapper[4743]: I0310 15:27:52.692425 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.346614 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.415509 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-config-data\") pod \"482b3103-f6d6-410f-9106-b10ad1695c78\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") "
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.415645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-scripts\") pod \"482b3103-f6d6-410f-9106-b10ad1695c78\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") "
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.415710 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-run-httpd\") pod \"482b3103-f6d6-410f-9106-b10ad1695c78\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") "
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.415792 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-log-httpd\") pod \"482b3103-f6d6-410f-9106-b10ad1695c78\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") "
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.415890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-sg-core-conf-yaml\") pod \"482b3103-f6d6-410f-9106-b10ad1695c78\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") "
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.415928 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbmzl\" (UniqueName: \"kubernetes.io/projected/482b3103-f6d6-410f-9106-b10ad1695c78-kube-api-access-hbmzl\") pod \"482b3103-f6d6-410f-9106-b10ad1695c78\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") "
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.415974 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-combined-ca-bundle\") pod \"482b3103-f6d6-410f-9106-b10ad1695c78\" (UID: \"482b3103-f6d6-410f-9106-b10ad1695c78\") "
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.418684 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "482b3103-f6d6-410f-9106-b10ad1695c78" (UID: "482b3103-f6d6-410f-9106-b10ad1695c78"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.418904 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "482b3103-f6d6-410f-9106-b10ad1695c78" (UID: "482b3103-f6d6-410f-9106-b10ad1695c78"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.426969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-scripts" (OuterVolumeSpecName: "scripts") pod "482b3103-f6d6-410f-9106-b10ad1695c78" (UID: "482b3103-f6d6-410f-9106-b10ad1695c78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.451158 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482b3103-f6d6-410f-9106-b10ad1695c78-kube-api-access-hbmzl" (OuterVolumeSpecName: "kube-api-access-hbmzl") pod "482b3103-f6d6-410f-9106-b10ad1695c78" (UID: "482b3103-f6d6-410f-9106-b10ad1695c78"). InnerVolumeSpecName "kube-api-access-hbmzl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.454419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-config-data" (OuterVolumeSpecName: "config-data") pod "482b3103-f6d6-410f-9106-b10ad1695c78" (UID: "482b3103-f6d6-410f-9106-b10ad1695c78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.470609 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "482b3103-f6d6-410f-9106-b10ad1695c78" (UID: "482b3103-f6d6-410f-9106-b10ad1695c78"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.496096 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "482b3103-f6d6-410f-9106-b10ad1695c78" (UID: "482b3103-f6d6-410f-9106-b10ad1695c78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.519069 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.519127 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.519145 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbmzl\" (UniqueName: \"kubernetes.io/projected/482b3103-f6d6-410f-9106-b10ad1695c78-kube-api-access-hbmzl\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.519156 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.519169 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.519181 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482b3103-f6d6-410f-9106-b10ad1695c78-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.519191 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/482b3103-f6d6-410f-9106-b10ad1695c78-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.576910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-76whq" event={"ID":"2bb57920-f14f-4e98-bea5-08e3aa17ffb1","Type":"ContainerStarted","Data":"57d71a9b49c92580bb1171b0636f48c22d22afa7827c7b2cd4c6a9807980e759"}
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.578035 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-76whq"
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.580964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"943d6458-faf7-4ed9-b883-51bfae20d07e","Type":"ContainerStarted","Data":"c2dae9e478ac1ef483d6362aadd56337b8bfa2da59946609f91a556a0a37d94c"}
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.583786 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"38279d60-7565-460d-a703-b6aac3615f2c","Type":"ContainerStarted","Data":"2efc484f75e97c22037a95da5ff71f40ded1470d05ea07d0728da7b449951e4b"}
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.590695 4743 generic.go:334] "Generic (PLEG): container finished" podID="482b3103-f6d6-410f-9106-b10ad1695c78" containerID="3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9" exitCode=0
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.590919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"482b3103-f6d6-410f-9106-b10ad1695c78","Type":"ContainerDied","Data":"3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9"}
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.590962 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"482b3103-f6d6-410f-9106-b10ad1695c78","Type":"ContainerDied","Data":"1acb1cf3d6c2af05dee47b1149592a0ec38eecf3d1d1997f3e2f3231ab5e63c0"}
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.590982 4743 scope.go:117] "RemoveContainer" containerID="84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5"
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.591131 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.601017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2a12054f-0a1c-4294-8855-bcb45a1e3684","Type":"ContainerStarted","Data":"507a736943caae863597c5bbd129ead8e0aa5bcab55f2caa3adba073be5b44cd"}
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.607937 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-76whq" podStartSLOduration=4.607907901 podStartE2EDuration="4.607907901s" podCreationTimestamp="2026-03-10 15:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:53.597856096 +0000 UTC m=+1338.304670844" watchObservedRunningTime="2026-03-10 15:27:53.607907901 +0000 UTC m=+1338.314722649"
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.710023 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.720326 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736026 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:27:53 crc kubenswrapper[4743]: E0310 15:27:53.736456 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" containerName="ceilometer-notification-agent"
Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736474 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" containerName="ceilometer-notification-agent"
Mar 10
15:27:53 crc kubenswrapper[4743]: E0310 15:27:53.736490 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" containerName="sg-core" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736496 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" containerName="sg-core" Mar 10 15:27:53 crc kubenswrapper[4743]: E0310 15:27:53.736517 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-httpd" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736523 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-httpd" Mar 10 15:27:53 crc kubenswrapper[4743]: E0310 15:27:53.736536 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-api" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736542 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-api" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736742 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-api" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736756 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" containerName="ceilometer-notification-agent" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736768 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" containerName="sg-core" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.736780 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f36e37-4d73-48c9-a64b-810a95ed7bac" containerName="neutron-httpd" Mar 10 15:27:53 crc 
kubenswrapper[4743]: I0310 15:27:53.738783 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.745574 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.745741 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.748448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.826046 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-run-httpd\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.826462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-config-data\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.826494 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.826527 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jbt7\" (UniqueName: 
\"kubernetes.io/projected/ae9e10bd-56f9-4223-a2ea-9eadfe923042-kube-api-access-8jbt7\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.826580 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-log-httpd\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.826607 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.826632 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-scripts\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.867783 4743 scope.go:117] "RemoveContainer" containerID="3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.921838 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-jzmg2" podUID="1d51c7ac-111d-46e8-903f-01f29e4221ac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: i/o timeout" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.929184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-run-httpd\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.929228 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-config-data\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.929258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.929291 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jbt7\" (UniqueName: \"kubernetes.io/projected/ae9e10bd-56f9-4223-a2ea-9eadfe923042-kube-api-access-8jbt7\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.929338 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-log-httpd\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.929363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc 
kubenswrapper[4743]: I0310 15:27:53.929392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-scripts\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.934339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-run-httpd\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.934729 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-log-httpd\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.937530 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-scripts\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.938623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-config-data\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.945718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " 
pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.945901 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482b3103-f6d6-410f-9106-b10ad1695c78" path="/var/lib/kubelet/pods/482b3103-f6d6-410f-9106-b10ad1695c78/volumes" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.946016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.952181 4743 scope.go:117] "RemoveContainer" containerID="84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5" Mar 10 15:27:53 crc kubenswrapper[4743]: E0310 15:27:53.959016 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5\": container with ID starting with 84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5 not found: ID does not exist" containerID="84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.959059 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5"} err="failed to get container status \"84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5\": rpc error: code = NotFound desc = could not find container \"84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5\": container with ID starting with 84bf9194b61af0fa3b600b584a0b07babe4d3527a65dc48c174fb0a0de5c53a5 not found: ID does not exist" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.959090 4743 scope.go:117] "RemoveContainer" 
containerID="3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.959606 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jbt7\" (UniqueName: \"kubernetes.io/projected/ae9e10bd-56f9-4223-a2ea-9eadfe923042-kube-api-access-8jbt7\") pod \"ceilometer-0\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " pod="openstack/ceilometer-0" Mar 10 15:27:53 crc kubenswrapper[4743]: E0310 15:27:53.963194 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9\": container with ID starting with 3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9 not found: ID does not exist" containerID="3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9" Mar 10 15:27:53 crc kubenswrapper[4743]: I0310 15:27:53.963268 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9"} err="failed to get container status \"3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9\": rpc error: code = NotFound desc = could not find container \"3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9\": container with ID starting with 3c6be5e98eaf02c69500c31078b1e63aac495e5c15ef338c11ad7e478b3fa5c9 not found: ID does not exist" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.067534 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.247952 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7b59b66d-df78r" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": read tcp 10.217.0.2:41178->10.217.0.172:9311: read: connection reset by peer" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.248275 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7b59b66d-df78r" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": read tcp 10.217.0.2:41162->10.217.0.172:9311: read: connection reset by peer" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.371512 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lgtjg" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.445528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-combined-ca-bundle\") pod \"45477f7e-f216-40fb-acdb-d7a1dbadba99\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.445610 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpq2h\" (UniqueName: \"kubernetes.io/projected/45477f7e-f216-40fb-acdb-d7a1dbadba99-kube-api-access-rpq2h\") pod \"45477f7e-f216-40fb-acdb-d7a1dbadba99\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.445744 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: 
\"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-job-config-data\") pod \"45477f7e-f216-40fb-acdb-d7a1dbadba99\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.445885 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-config-data\") pod \"45477f7e-f216-40fb-acdb-d7a1dbadba99\" (UID: \"45477f7e-f216-40fb-acdb-d7a1dbadba99\") " Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.474951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45477f7e-f216-40fb-acdb-d7a1dbadba99-kube-api-access-rpq2h" (OuterVolumeSpecName: "kube-api-access-rpq2h") pod "45477f7e-f216-40fb-acdb-d7a1dbadba99" (UID: "45477f7e-f216-40fb-acdb-d7a1dbadba99"). InnerVolumeSpecName "kube-api-access-rpq2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.478343 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "45477f7e-f216-40fb-acdb-d7a1dbadba99" (UID: "45477f7e-f216-40fb-acdb-d7a1dbadba99"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.513090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-config-data" (OuterVolumeSpecName: "config-data") pod "45477f7e-f216-40fb-acdb-d7a1dbadba99" (UID: "45477f7e-f216-40fb-acdb-d7a1dbadba99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.548884 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpq2h\" (UniqueName: \"kubernetes.io/projected/45477f7e-f216-40fb-acdb-d7a1dbadba99-kube-api-access-rpq2h\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.548936 4743 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.548946 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.629506 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45477f7e-f216-40fb-acdb-d7a1dbadba99" (UID: "45477f7e-f216-40fb-acdb-d7a1dbadba99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.650668 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45477f7e-f216-40fb-acdb-d7a1dbadba99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.693761 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"38279d60-7565-460d-a703-b6aac3615f2c","Type":"ContainerStarted","Data":"c8cbbb7f60ea28cb5a3d805adcaa560c6f83d49eb8df01efc76ebe87bbbc4688"} Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.694015 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api-log" containerID="cri-o://2efc484f75e97c22037a95da5ff71f40ded1470d05ea07d0728da7b449951e4b" gracePeriod=30 Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.694129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.694725 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api" containerID="cri-o://c8cbbb7f60ea28cb5a3d805adcaa560c6f83d49eb8df01efc76ebe87bbbc4688" gracePeriod=30 Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.728685 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.728651773 podStartE2EDuration="4.728651773s" podCreationTimestamp="2026-03-10 15:27:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:54.721957823 +0000 UTC m=+1339.428772591" watchObservedRunningTime="2026-03-10 15:27:54.728651773 
+0000 UTC m=+1339.435466521" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.740547 4743 generic.go:334] "Generic (PLEG): container finished" podID="7f942785-954d-4441-ac29-69e7b65ead94" containerID="d6fdde5ec4adfd5b915d588ff90fa45e205b51316b675665145cfb4b6acbf4b0" exitCode=0 Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.740934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b59b66d-df78r" event={"ID":"7f942785-954d-4441-ac29-69e7b65ead94","Type":"ContainerDied","Data":"d6fdde5ec4adfd5b915d588ff90fa45e205b51316b675665145cfb4b6acbf4b0"} Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.758586 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lgtjg" event={"ID":"45477f7e-f216-40fb-acdb-d7a1dbadba99","Type":"ContainerDied","Data":"f3cf19d30864c06e7978df8f3c591e44269951e51d874a164ee596b12a048ab8"} Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.758651 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3cf19d30864c06e7978df8f3c591e44269951e51d874a164ee596b12a048ab8" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.758844 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lgtjg" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.926464 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 15:27:54 crc kubenswrapper[4743]: E0310 15:27:54.944881 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45477f7e-f216-40fb-acdb-d7a1dbadba99" containerName="manila-db-sync" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.944941 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="45477f7e-f216-40fb-acdb-d7a1dbadba99" containerName="manila-db-sync" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.945177 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="45477f7e-f216-40fb-acdb-d7a1dbadba99" containerName="manila-db-sync" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.946566 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.964616 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.964806 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.964985 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.965135 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-nj79g" Mar 10 15:27:54 crc kubenswrapper[4743]: I0310 15:27:54.974464 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.067922 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 
15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.080998 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-scripts\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.081060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.081087 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8wrx\" (UniqueName: \"kubernetes.io/projected/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-kube-api-access-q8wrx\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.081135 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.081181 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.081313 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.119109 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.185458 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-scripts\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.185520 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.185549 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8wrx\" (UniqueName: \"kubernetes.io/projected/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-kube-api-access-q8wrx\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.185603 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 
15:27:55.185643 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.185708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.197985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.199125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.199768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-scripts\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.208754 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.216780 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.235689 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-76whq"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.240342 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8wrx\" (UniqueName: \"kubernetes.io/projected/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-kube-api-access-q8wrx\") pod \"manila-scheduler-0\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.249218 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.287165 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.295741 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.348732 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-hgqrq"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.351423 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.372159 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.387537 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.387928 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-hgqrq"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.388439 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.398676 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f942785-954d-4441-ac29-69e7b65ead94-logs\") pod \"7f942785-954d-4441-ac29-69e7b65ead94\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.399000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data-custom\") pod \"7f942785-954d-4441-ac29-69e7b65ead94\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.399060 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data\") pod \"7f942785-954d-4441-ac29-69e7b65ead94\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.399114 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ntwf\" (UniqueName: 
\"kubernetes.io/projected/7f942785-954d-4441-ac29-69e7b65ead94-kube-api-access-9ntwf\") pod \"7f942785-954d-4441-ac29-69e7b65ead94\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.399285 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-combined-ca-bundle\") pod \"7f942785-954d-4441-ac29-69e7b65ead94\" (UID: \"7f942785-954d-4441-ac29-69e7b65ead94\") " Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.399623 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 10 15:27:55 crc kubenswrapper[4743]: E0310 15:27:55.400114 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api-log" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400162 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api-log" Mar 10 15:27:55 crc kubenswrapper[4743]: E0310 15:27:55.400186 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400192 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400276 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-sb\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400373 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400410 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400425 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f942785-954d-4441-ac29-69e7b65ead94" containerName="barbican-api-log" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-swift-storage-0\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400511 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-config\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400589 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbt6r\" (UniqueName: \"kubernetes.io/projected/593624b1-1f23-4fdb-8b94-00837da810bc-kube-api-access-fbt6r\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400673 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400747 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-svc\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400872 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-ceph\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.400924 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.401011 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.401037 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-scripts\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.401055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd9bz\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-kube-api-access-zd9bz\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.401103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-nb\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.402571 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.405659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f942785-954d-4441-ac29-69e7b65ead94-logs" (OuterVolumeSpecName: "logs") pod "7f942785-954d-4441-ac29-69e7b65ead94" (UID: "7f942785-954d-4441-ac29-69e7b65ead94"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.411282 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.421372 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f942785-954d-4441-ac29-69e7b65ead94" (UID: "7f942785-954d-4441-ac29-69e7b65ead94"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.423492 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f942785-954d-4441-ac29-69e7b65ead94-kube-api-access-9ntwf" (OuterVolumeSpecName: "kube-api-access-9ntwf") pod "7f942785-954d-4441-ac29-69e7b65ead94" (UID: "7f942785-954d-4441-ac29-69e7b65ead94"). InnerVolumeSpecName "kube-api-access-9ntwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.426211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.459885 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68c9f99d4d-r9tj5" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.513583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-sb\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-swift-storage-0\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-scripts\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522499 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-config\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522591 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnt6z\" (UniqueName: \"kubernetes.io/projected/d528d011-c6fb-4786-8d66-1fc289bd91cc-kube-api-access-pnt6z\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522662 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbt6r\" (UniqueName: \"kubernetes.io/projected/593624b1-1f23-4fdb-8b94-00837da810bc-kube-api-access-fbt6r\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522750 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d528d011-c6fb-4786-8d66-1fc289bd91cc-etc-machine-id\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.522848 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d528d011-c6fb-4786-8d66-1fc289bd91cc-logs\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.517557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-sb\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.523507 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-svc\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: 
\"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-ceph\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525427 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525539 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525575 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-scripts\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525604 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data-custom\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525626 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd9bz\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-kube-api-access-zd9bz\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525673 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525701 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-nb\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525798 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f942785-954d-4441-ac29-69e7b65ead94-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525830 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.525843 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ntwf\" (UniqueName: \"kubernetes.io/projected/7f942785-954d-4441-ac29-69e7b65ead94-kube-api-access-9ntwf\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.534894 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.535771 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-nb\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.536312 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-swift-storage-0\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.537782 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-config\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.538111 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-svc\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.539181 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-scripts\") pod 
\"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.539253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.540305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-ceph\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.544726 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.546157 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.548186 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd9bz\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-kube-api-access-zd9bz\") pod \"manila-share-share1-0\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") " pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 
15:27:55.576790 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbt6r\" (UniqueName: \"kubernetes.io/projected/593624b1-1f23-4fdb-8b94-00837da810bc-kube-api-access-fbt6r\") pod \"dnsmasq-dns-57d6d889f-hgqrq\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.577147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f942785-954d-4441-ac29-69e7b65ead94" (UID: "7f942785-954d-4441-ac29-69e7b65ead94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.628028 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.628358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnt6z\" (UniqueName: \"kubernetes.io/projected/d528d011-c6fb-4786-8d66-1fc289bd91cc-kube-api-access-pnt6z\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.628417 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d528d011-c6fb-4786-8d66-1fc289bd91cc-etc-machine-id\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.628439 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d528d011-c6fb-4786-8d66-1fc289bd91cc-logs\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.628550 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data-custom\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.628582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.628640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-scripts\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.628697 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.629052 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d528d011-c6fb-4786-8d66-1fc289bd91cc-logs\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.629371 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d528d011-c6fb-4786-8d66-1fc289bd91cc-etc-machine-id\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.645745 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.646619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data-custom\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.650528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data" (OuterVolumeSpecName: "config-data") pod "7f942785-954d-4441-ac29-69e7b65ead94" (UID: "7f942785-954d-4441-ac29-69e7b65ead94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.652311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-scripts\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.659163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.661296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.683487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnt6z\" (UniqueName: \"kubernetes.io/projected/d528d011-c6fb-4786-8d66-1fc289bd91cc-kube-api-access-pnt6z\") pod \"manila-api-0\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.703789 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.738906 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f942785-954d-4441-ac29-69e7b65ead94-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.753297 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.753396 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.754326 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"8219c4b41eb7545ea397447365b146eeb777554480ab78e1282ead6e8b54a642"} pod="openstack/horizon-7954db6464-ns5cf" containerMessage="Container horizon failed startup probe, will be restarted" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.754367 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" containerID="cri-o://8219c4b41eb7545ea397447365b146eeb777554480ab78e1282ead6e8b54a642" gracePeriod=30 Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.765793 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.816053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2a12054f-0a1c-4294-8855-bcb45a1e3684","Type":"ContainerStarted","Data":"688cab3c354d213efc05f8ff649dcfe221e23e51d386b88b7335489c46050c48"} Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.838583 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-958fd895b-mxn2t" podUID="cccf05c8-d4e8-4a1d-912f-5f4a37440ac7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.838693 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.854225 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64b84f4b48-6qhqj"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.856705 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64b84f4b48-6qhqj"] Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.856794 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.856987 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"c8e35c99a898ab5b18b13aae107bf529b1f8d349a368587867349ca074eddf3a"} pod="openstack/horizon-958fd895b-mxn2t" containerMessage="Container horizon failed startup probe, will be restarted" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.857040 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-958fd895b-mxn2t" podUID="cccf05c8-d4e8-4a1d-912f-5f4a37440ac7" containerName="horizon" containerID="cri-o://c8e35c99a898ab5b18b13aae107bf529b1f8d349a368587867349ca074eddf3a" gracePeriod=30 Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.867216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d4c7db42-928a-4d49-95df-e2073ad24b21","Type":"ContainerStarted","Data":"627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70"} Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.867274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d4c7db42-928a-4d49-95df-e2073ad24b21","Type":"ContainerStarted","Data":"3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0"} Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.885071 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.889320 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.618052491 podStartE2EDuration="5.889305945s" podCreationTimestamp="2026-03-10 15:27:50 +0000 UTC" firstStartedPulling="2026-03-10 15:27:51.81137912 +0000 UTC m=+1336.518193868" lastFinishedPulling="2026-03-10 
15:27:53.082632574 +0000 UTC m=+1337.789447322" observedRunningTime="2026-03-10 15:27:55.886226618 +0000 UTC m=+1340.593041366" watchObservedRunningTime="2026-03-10 15:27:55.889305945 +0000 UTC m=+1340.596120693" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.892371 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"943d6458-faf7-4ed9-b883-51bfae20d07e","Type":"ContainerStarted","Data":"129273effd7281d6673d014609e48e033f4188a95f53d8c08997f95894aefb88"} Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.921021 4743 generic.go:334] "Generic (PLEG): container finished" podID="38279d60-7565-460d-a703-b6aac3615f2c" containerID="2efc484f75e97c22037a95da5ff71f40ded1470d05ea07d0728da7b449951e4b" exitCode=143 Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.943210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-internal-tls-certs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.943381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-scripts\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.943450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-config-data\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:55 crc 
kubenswrapper[4743]: I0310 15:27:55.943499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jthk8\" (UniqueName: \"kubernetes.io/projected/99416e79-6415-4c9d-93b0-920307f57e4c-kube-api-access-jthk8\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.943566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-combined-ca-bundle\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.943804 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99416e79-6415-4c9d-93b0-920307f57e4c-logs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.943867 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-public-tls-certs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.952765 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b7b59b66d-df78r" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.959193 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.174971592 podStartE2EDuration="5.959172664s" podCreationTimestamp="2026-03-10 15:27:50 +0000 UTC" firstStartedPulling="2026-03-10 15:27:52.166395335 +0000 UTC m=+1336.873210083" lastFinishedPulling="2026-03-10 15:27:53.950596417 +0000 UTC m=+1338.657411155" observedRunningTime="2026-03-10 15:27:55.944469248 +0000 UTC m=+1340.651283996" watchObservedRunningTime="2026-03-10 15:27:55.959172664 +0000 UTC m=+1340.665987402" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.968259 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"38279d60-7565-460d-a703-b6aac3615f2c","Type":"ContainerDied","Data":"2efc484f75e97c22037a95da5ff71f40ded1470d05ea07d0728da7b449951e4b"} Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.969640 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b59b66d-df78r" event={"ID":"7f942785-954d-4441-ac29-69e7b65ead94","Type":"ContainerDied","Data":"4d7f02810547df8b25aca23259a587c4cf1b119718eb9e12f206d49a9ff93e02"} Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.969774 4743 scope.go:117] "RemoveContainer" containerID="d6fdde5ec4adfd5b915d588ff90fa45e205b51316b675665145cfb4b6acbf4b0" Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.982384 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerStarted","Data":"872f8243a2ec3bdae89d1e1f5a5ec93b22661aaa5301012aa6e2153d11ec5869"} Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.982787 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-76whq" 
podUID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" containerName="dnsmasq-dns" containerID="cri-o://57d71a9b49c92580bb1171b0636f48c22d22afa7827c7b2cd4c6a9807980e759" gracePeriod=10 Mar 10 15:27:55 crc kubenswrapper[4743]: I0310 15:27:55.985291 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.217024354 podStartE2EDuration="6.985270493s" podCreationTimestamp="2026-03-10 15:27:49 +0000 UTC" firstStartedPulling="2026-03-10 15:27:51.628315845 +0000 UTC m=+1336.335130583" lastFinishedPulling="2026-03-10 15:27:52.396561974 +0000 UTC m=+1337.103376722" observedRunningTime="2026-03-10 15:27:55.978293156 +0000 UTC m=+1340.685107904" watchObservedRunningTime="2026-03-10 15:27:55.985270493 +0000 UTC m=+1340.692085241" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.039049 4743 scope.go:117] "RemoveContainer" containerID="a44b0ab62e898d54d43645d8c85eb8aaa2e9faaa6c8a1ebe1d196220ca7d6e80" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.058305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-config-data\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.058421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jthk8\" (UniqueName: \"kubernetes.io/projected/99416e79-6415-4c9d-93b0-920307f57e4c-kube-api-access-jthk8\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.058529 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-combined-ca-bundle\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.058709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99416e79-6415-4c9d-93b0-920307f57e4c-logs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.058753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-public-tls-certs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.058793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-internal-tls-certs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.058972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-scripts\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.060547 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.068371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99416e79-6415-4c9d-93b0-920307f57e4c-logs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.077073 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-scripts\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.078508 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-config-data\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.075279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-internal-tls-certs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.085242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-public-tls-certs\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.086215 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99416e79-6415-4c9d-93b0-920307f57e4c-combined-ca-bundle\") pod 
\"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.095911 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jthk8\" (UniqueName: \"kubernetes.io/projected/99416e79-6415-4c9d-93b0-920307f57e4c-kube-api-access-jthk8\") pod \"placement-64b84f4b48-6qhqj\" (UID: \"99416e79-6415-4c9d-93b0-920307f57e4c\") " pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.110017 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b7b59b66d-df78r"] Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.120580 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6b7b59b66d-df78r"] Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.257918 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.517511 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.923246 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-hgqrq"] Mar 10 15:27:56 crc kubenswrapper[4743]: I0310 15:27:56.955422 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64b84f4b48-6qhqj"] Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.023104 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerStarted","Data":"230b1bc273437d1877bdcf2e4a915dd66e45a9a78f449e76179bc080091e4000"} Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.025741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"98c2fbbb-1c4c-421c-9d27-fae1884c9b54","Type":"ContainerStarted","Data":"539c675b1f75e292301db8581ad63c1c6896f433a40c382186814d80e250c10c"} Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.050014 4743 generic.go:334] "Generic (PLEG): container finished" podID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" containerID="57d71a9b49c92580bb1171b0636f48c22d22afa7827c7b2cd4c6a9807980e759" exitCode=0 Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.050526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-76whq" event={"ID":"2bb57920-f14f-4e98-bea5-08e3aa17ffb1","Type":"ContainerDied","Data":"57d71a9b49c92580bb1171b0636f48c22d22afa7827c7b2cd4c6a9807980e759"} Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.050579 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-76whq" event={"ID":"2bb57920-f14f-4e98-bea5-08e3aa17ffb1","Type":"ContainerDied","Data":"10f272e59b6080743d855852f51bda324a4ffc1dae7bcf3f0f89620c953962f6"} Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.050596 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10f272e59b6080743d855852f51bda324a4ffc1dae7bcf3f0f89620c953962f6" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.069948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"4b909bf2-989f-40bd-87ce-ccd96ec9d39e","Type":"ContainerStarted","Data":"1d8f5503ae1c00b7df8210ca51791146eb07edb8a081650513e943b537567148"} Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.069999 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.087076 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.223467 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-nb\") pod \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.223564 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-config\") pod \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.224059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-svc\") pod \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.224127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwmhr\" (UniqueName: \"kubernetes.io/projected/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-kube-api-access-jwmhr\") pod \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.224145 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-swift-storage-0\") pod \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.224390 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-sb\") pod \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\" (UID: \"2bb57920-f14f-4e98-bea5-08e3aa17ffb1\") " Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.247248 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-kube-api-access-jwmhr" (OuterVolumeSpecName: "kube-api-access-jwmhr") pod "2bb57920-f14f-4e98-bea5-08e3aa17ffb1" (UID: "2bb57920-f14f-4e98-bea5-08e3aa17ffb1"). InnerVolumeSpecName "kube-api-access-jwmhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.331108 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwmhr\" (UniqueName: \"kubernetes.io/projected/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-kube-api-access-jwmhr\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.347934 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2bb57920-f14f-4e98-bea5-08e3aa17ffb1" (UID: "2bb57920-f14f-4e98-bea5-08e3aa17ffb1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.370754 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bb57920-f14f-4e98-bea5-08e3aa17ffb1" (UID: "2bb57920-f14f-4e98-bea5-08e3aa17ffb1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.382766 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bb57920-f14f-4e98-bea5-08e3aa17ffb1" (UID: "2bb57920-f14f-4e98-bea5-08e3aa17ffb1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.407000 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2bb57920-f14f-4e98-bea5-08e3aa17ffb1" (UID: "2bb57920-f14f-4e98-bea5-08e3aa17ffb1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.438146 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.438187 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.438229 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.438242 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.439308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-config" (OuterVolumeSpecName: "config") pod "2bb57920-f14f-4e98-bea5-08e3aa17ffb1" (UID: "2bb57920-f14f-4e98-bea5-08e3aa17ffb1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.540536 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb57920-f14f-4e98-bea5-08e3aa17ffb1-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:57 crc kubenswrapper[4743]: I0310 15:27:57.973918 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f942785-954d-4441-ac29-69e7b65ead94" path="/var/lib/kubelet/pods/7f942785-954d-4441-ac29-69e7b65ead94/volumes" Mar 10 15:27:58 crc kubenswrapper[4743]: I0310 15:27:58.094165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64b84f4b48-6qhqj" event={"ID":"99416e79-6415-4c9d-93b0-920307f57e4c","Type":"ContainerStarted","Data":"17dfe6639309daed0cfafb15cac37076aabcd50275b17c5dece48f6c444ea54f"} Mar 10 15:27:58 crc kubenswrapper[4743]: I0310 15:27:58.094228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64b84f4b48-6qhqj" event={"ID":"99416e79-6415-4c9d-93b0-920307f57e4c","Type":"ContainerStarted","Data":"3420e654c776cb415174744a99e399edf5a35ebef290f9ab9ebfa6eea14f46fa"} Mar 10 15:27:58 crc kubenswrapper[4743]: I0310 15:27:58.107923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d528d011-c6fb-4786-8d66-1fc289bd91cc","Type":"ContainerStarted","Data":"79a6156ece062193ab4263278ef85446808cabc8f88e34048c8b05006f1746b4"} Mar 10 15:27:58 crc kubenswrapper[4743]: I0310 15:27:58.122935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerStarted","Data":"67c2317d09f90aa678ba4358c6d39f9ed608fd6a422c895fb4a0d598a2b0ff10"} Mar 10 15:27:58 crc kubenswrapper[4743]: I0310 15:27:58.138468 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-76whq" Mar 10 15:27:58 crc kubenswrapper[4743]: I0310 15:27:58.139586 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" event={"ID":"593624b1-1f23-4fdb-8b94-00837da810bc","Type":"ContainerStarted","Data":"81f5e97c4992dcf4e8739c11b9aa9d88fd244b45144544f6bea1c21957f8c3a5"} Mar 10 15:27:58 crc kubenswrapper[4743]: I0310 15:27:58.226869 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-76whq"] Mar 10 15:27:58 crc kubenswrapper[4743]: I0310 15:27:58.248920 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-76whq"] Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.076645 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.157310 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64b84f4b48-6qhqj" event={"ID":"99416e79-6415-4c9d-93b0-920307f57e4c","Type":"ContainerStarted","Data":"89bf03db632ae626a3dd25837bf753660fcc7edbac73af12ffa84451d7fe4c7a"} Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.157454 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.163331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d528d011-c6fb-4786-8d66-1fc289bd91cc","Type":"ContainerStarted","Data":"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6"} Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 
15:27:59.163357 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d528d011-c6fb-4786-8d66-1fc289bd91cc","Type":"ContainerStarted","Data":"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13"} Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.163465 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerName="manila-api-log" containerID="cri-o://d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13" gracePeriod=30 Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.163512 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerName="manila-api" containerID="cri-o://ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6" gracePeriod=30 Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.163488 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.170908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerStarted","Data":"5d8c01218feae892cd3966a52852ea056dc65d8845fda0bf23283cc668ceeee7"} Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.173069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"98c2fbbb-1c4c-421c-9d27-fae1884c9b54","Type":"ContainerStarted","Data":"439d72982dac18daacc3f0d4f42611ef8a18e0f95664b46c08015c1d0ca4f536"} Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.173112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"98c2fbbb-1c4c-421c-9d27-fae1884c9b54","Type":"ContainerStarted","Data":"71096faa0e0eeec1fe27cbe9fdbb571288c7b89b0a890a8aa52a127d11c51f69"} Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.177665 4743 generic.go:334] "Generic (PLEG): container finished" podID="593624b1-1f23-4fdb-8b94-00837da810bc" containerID="ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86" exitCode=0 Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.177701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" event={"ID":"593624b1-1f23-4fdb-8b94-00837da810bc","Type":"ContainerDied","Data":"ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86"} Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.198336 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-64b84f4b48-6qhqj" podStartSLOduration=4.198320643 podStartE2EDuration="4.198320643s" podCreationTimestamp="2026-03-10 15:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:59.184633496 +0000 UTC m=+1343.891448244" watchObservedRunningTime="2026-03-10 15:27:59.198320643 +0000 UTC m=+1343.905135391" Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.287035 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.287012615 podStartE2EDuration="4.287012615s" podCreationTimestamp="2026-03-10 15:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:59.248302419 +0000 UTC m=+1343.955117157" watchObservedRunningTime="2026-03-10 15:27:59.287012615 +0000 UTC m=+1343.993827363" Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.309302 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/manila-scheduler-0" podStartSLOduration=4.46551956 podStartE2EDuration="5.309281926s" podCreationTimestamp="2026-03-10 15:27:54 +0000 UTC" firstStartedPulling="2026-03-10 15:27:56.205990355 +0000 UTC m=+1340.912805103" lastFinishedPulling="2026-03-10 15:27:57.049752721 +0000 UTC m=+1341.756567469" observedRunningTime="2026-03-10 15:27:59.293692005 +0000 UTC m=+1344.000506753" watchObservedRunningTime="2026-03-10 15:27:59.309281926 +0000 UTC m=+1344.016096674" Mar 10 15:27:59 crc kubenswrapper[4743]: I0310 15:27:59.935098 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" path="/var/lib/kubelet/pods/2bb57920-f14f-4e98-bea5-08e3aa17ffb1/volumes" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.039479 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.133524 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552608-5gj6v"] Mar 10 15:28:00 crc kubenswrapper[4743]: E0310 15:28:00.133999 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerName="manila-api-log" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.134013 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerName="manila-api-log" Mar 10 15:28:00 crc kubenswrapper[4743]: E0310 15:28:00.134030 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" containerName="dnsmasq-dns" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.134036 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" containerName="dnsmasq-dns" Mar 10 15:28:00 crc kubenswrapper[4743]: E0310 15:28:00.134054 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" containerName="init" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.134061 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" containerName="init" Mar 10 15:28:00 crc kubenswrapper[4743]: E0310 15:28:00.134081 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerName="manila-api" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.134087 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerName="manila-api" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.134264 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerName="manila-api-log" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.134276 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb57920-f14f-4e98-bea5-08e3aa17ffb1" containerName="dnsmasq-dns" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.134286 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerName="manila-api" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.134993 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552608-5gj6v" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.138910 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.138912 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.140601 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.146377 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data\") pod \"d528d011-c6fb-4786-8d66-1fc289bd91cc\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.146426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d528d011-c6fb-4786-8d66-1fc289bd91cc-etc-machine-id\") pod \"d528d011-c6fb-4786-8d66-1fc289bd91cc\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.146471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnt6z\" (UniqueName: \"kubernetes.io/projected/d528d011-c6fb-4786-8d66-1fc289bd91cc-kube-api-access-pnt6z\") pod \"d528d011-c6fb-4786-8d66-1fc289bd91cc\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.146563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-combined-ca-bundle\") pod \"d528d011-c6fb-4786-8d66-1fc289bd91cc\" (UID: 
\"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.146670 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d528d011-c6fb-4786-8d66-1fc289bd91cc-logs\") pod \"d528d011-c6fb-4786-8d66-1fc289bd91cc\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.146854 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-scripts\") pod \"d528d011-c6fb-4786-8d66-1fc289bd91cc\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.146910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data-custom\") pod \"d528d011-c6fb-4786-8d66-1fc289bd91cc\" (UID: \"d528d011-c6fb-4786-8d66-1fc289bd91cc\") " Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.151584 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d528d011-c6fb-4786-8d66-1fc289bd91cc-logs" (OuterVolumeSpecName: "logs") pod "d528d011-c6fb-4786-8d66-1fc289bd91cc" (UID: "d528d011-c6fb-4786-8d66-1fc289bd91cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.157112 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d528d011-c6fb-4786-8d66-1fc289bd91cc-kube-api-access-pnt6z" (OuterVolumeSpecName: "kube-api-access-pnt6z") pod "d528d011-c6fb-4786-8d66-1fc289bd91cc" (UID: "d528d011-c6fb-4786-8d66-1fc289bd91cc"). InnerVolumeSpecName "kube-api-access-pnt6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.158155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d528d011-c6fb-4786-8d66-1fc289bd91cc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d528d011-c6fb-4786-8d66-1fc289bd91cc" (UID: "d528d011-c6fb-4786-8d66-1fc289bd91cc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.159906 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d528d011-c6fb-4786-8d66-1fc289bd91cc" (UID: "d528d011-c6fb-4786-8d66-1fc289bd91cc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.162370 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552608-5gj6v"] Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.162783 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-scripts" (OuterVolumeSpecName: "scripts") pod "d528d011-c6fb-4786-8d66-1fc289bd91cc" (UID: "d528d011-c6fb-4786-8d66-1fc289bd91cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.216523 4743 generic.go:334] "Generic (PLEG): container finished" podID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerID="ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6" exitCode=143 Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.216571 4743 generic.go:334] "Generic (PLEG): container finished" podID="d528d011-c6fb-4786-8d66-1fc289bd91cc" containerID="d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13" exitCode=143 Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.216691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d528d011-c6fb-4786-8d66-1fc289bd91cc","Type":"ContainerDied","Data":"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6"} Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.216727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d528d011-c6fb-4786-8d66-1fc289bd91cc","Type":"ContainerDied","Data":"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13"} Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.216741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d528d011-c6fb-4786-8d66-1fc289bd91cc","Type":"ContainerDied","Data":"79a6156ece062193ab4263278ef85446808cabc8f88e34048c8b05006f1746b4"} Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.216761 4743 scope.go:117] "RemoveContainer" containerID="ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.216959 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.225011 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0001988-feba-4afe-9068-071af12a6fd7" containerID="8219c4b41eb7545ea397447365b146eeb777554480ab78e1282ead6e8b54a642" exitCode=0 Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.225091 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7954db6464-ns5cf" event={"ID":"c0001988-feba-4afe-9068-071af12a6fd7","Type":"ContainerDied","Data":"8219c4b41eb7545ea397447365b146eeb777554480ab78e1282ead6e8b54a642"} Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.225124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7954db6464-ns5cf" event={"ID":"c0001988-feba-4afe-9068-071af12a6fd7","Type":"ContainerStarted","Data":"f0661b001f1e4aa05e793a6f1f5306c28f23ea2a63110c85c0439180d4cfb904"} Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.227138 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d528d011-c6fb-4786-8d66-1fc289bd91cc" (UID: "d528d011-c6fb-4786-8d66-1fc289bd91cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.236808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" event={"ID":"593624b1-1f23-4fdb-8b94-00837da810bc","Type":"ContainerStarted","Data":"2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7"} Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.236879 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.237480 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.253716 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhn82\" (UniqueName: \"kubernetes.io/projected/e1c5df5c-43af-4b40-8a2d-1db9b79a699b-kube-api-access-jhn82\") pod \"auto-csr-approver-29552608-5gj6v\" (UID: \"e1c5df5c-43af-4b40-8a2d-1db9b79a699b\") " pod="openshift-infra/auto-csr-approver-29552608-5gj6v" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.253898 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d528d011-c6fb-4786-8d66-1fc289bd91cc-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.253915 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.253928 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.253940 4743 
reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d528d011-c6fb-4786-8d66-1fc289bd91cc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.253953 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnt6z\" (UniqueName: \"kubernetes.io/projected/d528d011-c6fb-4786-8d66-1fc289bd91cc-kube-api-access-pnt6z\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.253966 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.254111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data" (OuterVolumeSpecName: "config-data") pod "d528d011-c6fb-4786-8d66-1fc289bd91cc" (UID: "d528d011-c6fb-4786-8d66-1fc289bd91cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.269709 4743 scope.go:117] "RemoveContainer" containerID="d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.285265 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" podStartSLOduration=5.285242367 podStartE2EDuration="5.285242367s" podCreationTimestamp="2026-03-10 15:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:00.269615454 +0000 UTC m=+1344.976430202" watchObservedRunningTime="2026-03-10 15:28:00.285242367 +0000 UTC m=+1344.992057125" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.311926 4743 scope.go:117] "RemoveContainer" containerID="ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6" Mar 10 15:28:00 crc kubenswrapper[4743]: E0310 15:28:00.315299 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6\": container with ID starting with ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6 not found: ID does not exist" containerID="ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.315348 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6"} err="failed to get container status \"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6\": rpc error: code = NotFound desc = could not find container \"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6\": container with ID starting with 
ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6 not found: ID does not exist" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.315368 4743 scope.go:117] "RemoveContainer" containerID="d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13" Mar 10 15:28:00 crc kubenswrapper[4743]: E0310 15:28:00.319334 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13\": container with ID starting with d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13 not found: ID does not exist" containerID="d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.319383 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13"} err="failed to get container status \"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13\": rpc error: code = NotFound desc = could not find container \"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13\": container with ID starting with d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13 not found: ID does not exist" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.319409 4743 scope.go:117] "RemoveContainer" containerID="ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.324186 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6"} err="failed to get container status \"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6\": rpc error: code = NotFound desc = could not find container \"ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6\": container with ID 
starting with ff44275229a209531fdefc53c4f9447b2968a06d241fd9b8ecd2bbe6a593d2e6 not found: ID does not exist" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.324221 4743 scope.go:117] "RemoveContainer" containerID="d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.324733 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13"} err="failed to get container status \"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13\": rpc error: code = NotFound desc = could not find container \"d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13\": container with ID starting with d1ef7e6063bb595c52269d469dc4063fa417df09e61ca9183a48e85aef271d13 not found: ID does not exist" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.355990 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhn82\" (UniqueName: \"kubernetes.io/projected/e1c5df5c-43af-4b40-8a2d-1db9b79a699b-kube-api-access-jhn82\") pod \"auto-csr-approver-29552608-5gj6v\" (UID: \"e1c5df5c-43af-4b40-8a2d-1db9b79a699b\") " pod="openshift-infra/auto-csr-approver-29552608-5gj6v" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.356252 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528d011-c6fb-4786-8d66-1fc289bd91cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.382515 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.383416 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhn82\" (UniqueName: \"kubernetes.io/projected/e1c5df5c-43af-4b40-8a2d-1db9b79a699b-kube-api-access-jhn82\") pod 
\"auto-csr-approver-29552608-5gj6v\" (UID: \"e1c5df5c-43af-4b40-8a2d-1db9b79a699b\") " pod="openshift-infra/auto-csr-approver-29552608-5gj6v" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.508152 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.522454 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552608-5gj6v" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.638106 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.669930 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.691740 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.695617 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.706799 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.706804 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.707048 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.721595 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.756205 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.756305 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.774009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.774186 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mqfm\" (UniqueName: \"kubernetes.io/projected/aeb95b3f-66df-4abf-99e5-b18c24053075-kube-api-access-2mqfm\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.774278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-scripts\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.774458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-config-data-custom\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.779086 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb95b3f-66df-4abf-99e5-b18c24053075-logs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.779210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aeb95b3f-66df-4abf-99e5-b18c24053075-etc-machine-id\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.779393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-public-tls-certs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.779617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-config-data\") pod \"manila-api-0\" (UID: 
\"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.779724 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.862635 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-public-tls-certs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-config-data\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-combined-ca-bundle\") pod \"manila-api-0\" (UID: 
\"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882271 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mqfm\" (UniqueName: \"kubernetes.io/projected/aeb95b3f-66df-4abf-99e5-b18c24053075-kube-api-access-2mqfm\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882293 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-scripts\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882354 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-config-data-custom\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882368 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb95b3f-66df-4abf-99e5-b18c24053075-logs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.882389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aeb95b3f-66df-4abf-99e5-b18c24053075-etc-machine-id\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.887793 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/aeb95b3f-66df-4abf-99e5-b18c24053075-logs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.887856 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aeb95b3f-66df-4abf-99e5-b18c24053075-etc-machine-id\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.891981 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-public-tls-certs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.896636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-config-data-custom\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.897062 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-scripts\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.898267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-config-data\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.916426 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.938158 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb95b3f-66df-4abf-99e5-b18c24053075-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.945523 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mqfm\" (UniqueName: \"kubernetes.io/projected/aeb95b3f-66df-4abf-99e5-b18c24053075-kube-api-access-2mqfm\") pod \"manila-api-0\" (UID: \"aeb95b3f-66df-4abf-99e5-b18c24053075\") " pod="openstack/manila-api-0" Mar 10 15:28:00 crc kubenswrapper[4743]: I0310 15:28:00.960252 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.026135 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.257452 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerStarted","Data":"650e5574e9bb87746a05a9434a5878c5d6bd44c0f3bb2fcd74afc13f777dcded"} Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.257961 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.261065 4743 generic.go:334] "Generic (PLEG): container finished" podID="cccf05c8-d4e8-4a1d-912f-5f4a37440ac7" containerID="c8e35c99a898ab5b18b13aae107bf529b1f8d349a368587867349ca074eddf3a" exitCode=0 Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.261123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-958fd895b-mxn2t" event={"ID":"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7","Type":"ContainerDied","Data":"c8e35c99a898ab5b18b13aae107bf529b1f8d349a368587867349ca074eddf3a"} Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.261149 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-958fd895b-mxn2t" event={"ID":"cccf05c8-d4e8-4a1d-912f-5f4a37440ac7","Type":"ContainerStarted","Data":"8cfe4a422796ab016a62de133ff8bf19ccfd7632ab21c1b603456e73bb8cfb8a"} Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.311231 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.713445134 podStartE2EDuration="8.311209304s" podCreationTimestamp="2026-03-10 15:27:53 +0000 UTC" firstStartedPulling="2026-03-10 15:27:55.198209292 +0000 UTC m=+1339.905024040" lastFinishedPulling="2026-03-10 15:28:00.795973472 +0000 UTC m=+1345.502788210" observedRunningTime="2026-03-10 15:28:01.280883365 +0000 UTC m=+1345.987698133" watchObservedRunningTime="2026-03-10 15:28:01.311209304 +0000 UTC m=+1346.018024052" 
Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.357870 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.384619 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.418021 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552608-5gj6v"] Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.535430 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.586716 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.732431 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 10 15:28:01 crc kubenswrapper[4743]: I0310 15:28:01.930300 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d528d011-c6fb-4786-8d66-1fc289bd91cc" path="/var/lib/kubelet/pods/d528d011-c6fb-4786-8d66-1fc289bd91cc/volumes" Mar 10 15:28:02 crc kubenswrapper[4743]: I0310 15:28:02.289170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552608-5gj6v" event={"ID":"e1c5df5c-43af-4b40-8a2d-1db9b79a699b","Type":"ContainerStarted","Data":"f35b493de8ae4656439ca459ece634b3b08c732530c135756db99a9f092ec956"} Mar 10 15:28:02 crc kubenswrapper[4743]: I0310 15:28:02.295948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aeb95b3f-66df-4abf-99e5-b18c24053075","Type":"ContainerStarted","Data":"64616075532de36bd9e097af7ad8cb5c9f321c04139946a2eb9c73c5c4ad9342"} Mar 10 15:28:02 crc kubenswrapper[4743]: I0310 15:28:02.296081 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-backup-0" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerName="cinder-backup" containerID="cri-o://507a736943caae863597c5bbd129ead8e0aa5bcab55f2caa3adba073be5b44cd" gracePeriod=30 Mar 10 15:28:02 crc kubenswrapper[4743]: I0310 15:28:02.296204 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerName="probe" containerID="cri-o://688cab3c354d213efc05f8ff649dcfe221e23e51d386b88b7335489c46050c48" gracePeriod=30 Mar 10 15:28:02 crc kubenswrapper[4743]: I0310 15:28:02.296269 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerName="cinder-volume" containerID="cri-o://3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0" gracePeriod=30 Mar 10 15:28:02 crc kubenswrapper[4743]: I0310 15:28:02.296452 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerName="probe" containerID="cri-o://627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70" gracePeriod=30 Mar 10 15:28:02 crc kubenswrapper[4743]: I0310 15:28:02.296522 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerName="cinder-scheduler" containerID="cri-o://c2dae9e478ac1ef483d6362aadd56337b8bfa2da59946609f91a556a0a37d94c" gracePeriod=30 Mar 10 15:28:02 crc kubenswrapper[4743]: I0310 15:28:02.297096 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerName="probe" containerID="cri-o://129273effd7281d6673d014609e48e033f4188a95f53d8c08997f95894aefb88" gracePeriod=30 Mar 10 15:28:03 crc kubenswrapper[4743]: I0310 
15:28:03.336058 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aeb95b3f-66df-4abf-99e5-b18c24053075","Type":"ContainerStarted","Data":"abcaf1cb1d09609b1c71be66312531d7278692e44df6764e908b15b9088b1cf0"} Mar 10 15:28:03 crc kubenswrapper[4743]: I0310 15:28:03.346615 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerID="3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0" exitCode=0 Mar 10 15:28:03 crc kubenswrapper[4743]: I0310 15:28:03.346864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d4c7db42-928a-4d49-95df-e2073ad24b21","Type":"ContainerDied","Data":"3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0"} Mar 10 15:28:03 crc kubenswrapper[4743]: I0310 15:28:03.361096 4743 generic.go:334] "Generic (PLEG): container finished" podID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerID="129273effd7281d6673d014609e48e033f4188a95f53d8c08997f95894aefb88" exitCode=0 Mar 10 15:28:03 crc kubenswrapper[4743]: I0310 15:28:03.361242 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"943d6458-faf7-4ed9-b883-51bfae20d07e","Type":"ContainerDied","Data":"129273effd7281d6673d014609e48e033f4188a95f53d8c08997f95894aefb88"} Mar 10 15:28:03 crc kubenswrapper[4743]: I0310 15:28:03.370648 4743 generic.go:334] "Generic (PLEG): container finished" podID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerID="688cab3c354d213efc05f8ff649dcfe221e23e51d386b88b7335489c46050c48" exitCode=0 Mar 10 15:28:03 crc kubenswrapper[4743]: I0310 15:28:03.370739 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2a12054f-0a1c-4294-8855-bcb45a1e3684","Type":"ContainerDied","Data":"688cab3c354d213efc05f8ff649dcfe221e23e51d386b88b7335489c46050c48"} Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.243632 4743 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.390561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data-custom\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.390875 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-machine-id\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.390898 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-combined-ca-bundle\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.390991 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-brick\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391014 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-scripts\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391057 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8rxk4\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-kube-api-access-8rxk4\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-sys\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391141 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-dev\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391195 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-nvme\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391203 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391220 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-cinder\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-lib-modules\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-iscsi\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391308 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-lib-cinder\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391333 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391383 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-run\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391417 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-ceph\") pod \"d4c7db42-928a-4d49-95df-e2073ad24b21\" (UID: \"d4c7db42-928a-4d49-95df-e2073ad24b21\") " Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.391952 4743 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.396476 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.402223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.402302 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.402322 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.402341 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-sys" (OuterVolumeSpecName: "sys") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.413218 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.413702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-dev" (OuterVolumeSpecName: "dev") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.415731 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-run" (OuterVolumeSpecName: "run") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.416026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.416057 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.435069 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.435187 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-ceph" (OuterVolumeSpecName: "ceph") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.437040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-kube-api-access-8rxk4" (OuterVolumeSpecName: "kube-api-access-8rxk4") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "kube-api-access-8rxk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.437130 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-scripts" (OuterVolumeSpecName: "scripts") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.456378 4743 generic.go:334] "Generic (PLEG): container finished" podID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerID="507a736943caae863597c5bbd129ead8e0aa5bcab55f2caa3adba073be5b44cd" exitCode=0 Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.456469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2a12054f-0a1c-4294-8855-bcb45a1e3684","Type":"ContainerDied","Data":"507a736943caae863597c5bbd129ead8e0aa5bcab55f2caa3adba073be5b44cd"} Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495079 4743 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-sys\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495113 4743 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-dev\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495123 4743 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495135 4743 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495144 4743 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495151 4743 reconciler_common.go:293] "Volume 
detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495159 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495167 4743 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495176 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495184 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495194 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4c7db42-928a-4d49-95df-e2073ad24b21-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495202 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495210 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rxk4\" (UniqueName: \"kubernetes.io/projected/d4c7db42-928a-4d49-95df-e2073ad24b21-kube-api-access-8rxk4\") on node \"crc\" DevicePath \"\"" Mar 10 
15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495559 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1c5df5c-43af-4b40-8a2d-1db9b79a699b" containerID="878a24c679391f55b2dd107b7b07c7f9c11c9a18caef59e9ad5d91074f785300" exitCode=0 Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.495659 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552608-5gj6v" event={"ID":"e1c5df5c-43af-4b40-8a2d-1db9b79a699b","Type":"ContainerDied","Data":"878a24c679391f55b2dd107b7b07c7f9c11c9a18caef59e9ad5d91074f785300"} Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.545209 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aeb95b3f-66df-4abf-99e5-b18c24053075","Type":"ContainerStarted","Data":"7ef246a353fc3908ad7bf8c703dc56c8f2b581e4d85cb50bdfde0427955be027"} Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.545829 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.562930 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.588608 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerID="627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70" exitCode=0 Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.588709 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d4c7db42-928a-4d49-95df-e2073ad24b21","Type":"ContainerDied","Data":"627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70"} Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.588744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d4c7db42-928a-4d49-95df-e2073ad24b21","Type":"ContainerDied","Data":"ec79dee920edc940df1a7b6bfd548fcabc44ad49acdf23bd874858b580c22022"} Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.588762 4743 scope.go:117] "RemoveContainer" containerID="627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.588923 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.598240 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.605658 4743 generic.go:334] "Generic (PLEG): container finished" podID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerID="c2dae9e478ac1ef483d6362aadd56337b8bfa2da59946609f91a556a0a37d94c" exitCode=0 Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.605709 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"943d6458-faf7-4ed9-b883-51bfae20d07e","Type":"ContainerDied","Data":"c2dae9e478ac1ef483d6362aadd56337b8bfa2da59946609f91a556a0a37d94c"} Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.640574 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.640552508 podStartE2EDuration="4.640552508s" podCreationTimestamp="2026-03-10 15:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:04.596363006 +0000 UTC m=+1349.303177754" watchObservedRunningTime="2026-03-10 15:28:04.640552508 +0000 UTC m=+1349.347367256" Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.820985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data" (OuterVolumeSpecName: "config-data") pod "d4c7db42-928a-4d49-95df-e2073ad24b21" (UID: "d4c7db42-928a-4d49-95df-e2073ad24b21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.834067 4743 scope.go:117] "RemoveContainer" containerID="3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0"
Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.850464 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.912791 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c7db42-928a-4d49-95df-e2073ad24b21-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.995052 4743 scope.go:117] "RemoveContainer" containerID="627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70"
Mar 10 15:28:04 crc kubenswrapper[4743]: I0310 15:28:04.995158 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 10 15:28:05 crc kubenswrapper[4743]: E0310 15:28:05.005216 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70\": container with ID starting with 627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70 not found: ID does not exist" containerID="627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.005282 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70"} err="failed to get container status \"627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70\": rpc error: code = NotFound desc = could not find container \"627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70\": container with ID starting with 627bf9f162ec9f678d32f67e12f6b86011adb28c3773c631a72647e896f34b70 not found: ID does not exist"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.005321 4743 scope.go:117] "RemoveContainer" containerID="3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0"
Mar 10 15:28:05 crc kubenswrapper[4743]: E0310 15:28:05.010687 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0\": container with ID starting with 3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0 not found: ID does not exist" containerID="3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.010728 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0"} err="failed to get container status \"3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0\": rpc error: code = NotFound desc = could not find container \"3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0\": container with ID starting with 3bb8f1a7daf6c67dd2bc7ee32849302fba482b812d2ae96cf560879af91476b0 not found: ID does not exist"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016222 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-iscsi\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016291 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data-custom\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016368 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016406 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-dev\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-ceph\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016454 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-cinder\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016484 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-lib-modules\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016523 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-combined-ca-bundle\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016633 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-lib-cinder\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-sys\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-nvme\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016750 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-scripts\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016767 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-machine-id\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016826 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-brick\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.016940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-sys" (OuterVolumeSpecName: "sys") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017267 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-run\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017307 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prjwd\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-kube-api-access-prjwd\") pod \"2a12054f-0a1c-4294-8855-bcb45a1e3684\" (UID: \"2a12054f-0a1c-4294-8855-bcb45a1e3684\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017336 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017365 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-dev" (OuterVolumeSpecName: "dev") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017902 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017922 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017949 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-run" (OuterVolumeSpecName: "run") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017981 4743 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-sys\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.018001 4743 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-dev\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.018010 4743 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-lib-modules\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.017983 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.018002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.018920 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.020084 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.031346 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-kube-api-access-prjwd" (OuterVolumeSpecName: "kube-api-access-prjwd") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "kube-api-access-prjwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.041022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-ceph" (OuterVolumeSpecName: "ceph") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.041041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.041142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-scripts" (OuterVolumeSpecName: "scripts") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.051294 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 10 15:28:05 crc kubenswrapper[4743]: E0310 15:28:05.051793 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerName="cinder-volume"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.051979 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerName="cinder-volume"
Mar 10 15:28:05 crc kubenswrapper[4743]: E0310 15:28:05.051992 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerName="probe"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.052002 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerName="probe"
Mar 10 15:28:05 crc kubenswrapper[4743]: E0310 15:28:05.052039 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerName="probe"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.052046 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerName="probe"
Mar 10 15:28:05 crc kubenswrapper[4743]: E0310 15:28:05.052059 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerName="cinder-backup"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.052068 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerName="cinder-backup"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.052263 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerName="probe"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.052273 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" containerName="cinder-backup"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.052284 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerName="probe"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.052298 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" containerName="cinder-volume"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.053469 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.058866 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.102913 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126824 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-lib-cinder\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126863 4743 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-nvme\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126875 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126886 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126903 4743 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-brick\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126912 4743 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-run\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126922 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prjwd\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-kube-api-access-prjwd\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126937 4743 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-etc-iscsi\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126947 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126959 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2a12054f-0a1c-4294-8855-bcb45a1e3684-ceph\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.126968 4743 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2a12054f-0a1c-4294-8855-bcb45a1e3684-var-locks-cinder\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.152717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.222177 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229529 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-dev\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229620 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229653 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwmtj\" (UniqueName: \"kubernetes.io/projected/df8b86ca-f690-4581-a74f-ab245f3b2479-kube-api-access-kwmtj\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229682 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-run\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229742 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229773 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.229789 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.230121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.230139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.230164 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df8b86ca-f690-4581-a74f-ab245f3b2479-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.230183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.230408 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.230457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.230548 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-sys\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.230591 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.248157 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data" (OuterVolumeSpecName: "config-data") pod "2a12054f-0a1c-4294-8855-bcb45a1e3684" (UID: "2a12054f-0a1c-4294-8855-bcb45a1e3684"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.331425 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-combined-ca-bundle\") pod \"943d6458-faf7-4ed9-b883-51bfae20d07e\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.331532 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/943d6458-faf7-4ed9-b883-51bfae20d07e-etc-machine-id\") pod \"943d6458-faf7-4ed9-b883-51bfae20d07e\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.331633 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhf8w\" (UniqueName: \"kubernetes.io/projected/943d6458-faf7-4ed9-b883-51bfae20d07e-kube-api-access-lhf8w\") pod \"943d6458-faf7-4ed9-b883-51bfae20d07e\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.331715 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data\") pod \"943d6458-faf7-4ed9-b883-51bfae20d07e\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.331745 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data-custom\") pod \"943d6458-faf7-4ed9-b883-51bfae20d07e\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.331870 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-scripts\") pod \"943d6458-faf7-4ed9-b883-51bfae20d07e\" (UID: \"943d6458-faf7-4ed9-b883-51bfae20d07e\") "
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332383 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-sys\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332416 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-dev\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332459 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwmtj\" (UniqueName: \"kubernetes.io/projected/df8b86ca-f690-4581-a74f-ab245f3b2479-kube-api-access-kwmtj\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332503 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-run\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332590 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332607 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332660 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df8b86ca-f690-4581-a74f-ab245f3b2479-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332802 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a12054f-0a1c-4294-8855-bcb45a1e3684-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332922 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/943d6458-faf7-4ed9-b883-51bfae20d07e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "943d6458-faf7-4ed9-b883-51bfae20d07e" (UID: "943d6458-faf7-4ed9-b883-51bfae20d07e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.332975 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.333262 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-dev\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.333429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.333999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.334048 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.335488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-run\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.335580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.335603 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-sys\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0"
Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.335630 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName:
\"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.335674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df8b86ca-f690-4581-a74f-ab245f3b2479-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.338764 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.339241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.341472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.345023 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-scripts" (OuterVolumeSpecName: "scripts") pod "943d6458-faf7-4ed9-b883-51bfae20d07e" (UID: "943d6458-faf7-4ed9-b883-51bfae20d07e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.347654 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943d6458-faf7-4ed9-b883-51bfae20d07e-kube-api-access-lhf8w" (OuterVolumeSpecName: "kube-api-access-lhf8w") pod "943d6458-faf7-4ed9-b883-51bfae20d07e" (UID: "943d6458-faf7-4ed9-b883-51bfae20d07e"). InnerVolumeSpecName "kube-api-access-lhf8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.348233 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "943d6458-faf7-4ed9-b883-51bfae20d07e" (UID: "943d6458-faf7-4ed9-b883-51bfae20d07e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.351522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwmtj\" (UniqueName: \"kubernetes.io/projected/df8b86ca-f690-4581-a74f-ab245f3b2479-kube-api-access-kwmtj\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.352053 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8b86ca-f690-4581-a74f-ab245f3b2479-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.355319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df8b86ca-f690-4581-a74f-ab245f3b2479-ceph\") pod 
\"cinder-volume-volume1-0\" (UID: \"df8b86ca-f690-4581-a74f-ab245f3b2479\") " pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.362070 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5bbb89db44-8df8j" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.389973 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.424522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.434851 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/943d6458-faf7-4ed9-b883-51bfae20d07e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.434885 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhf8w\" (UniqueName: \"kubernetes.io/projected/943d6458-faf7-4ed9-b883-51bfae20d07e-kube-api-access-lhf8w\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.434902 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.434912 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.453478 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"943d6458-faf7-4ed9-b883-51bfae20d07e" (UID: "943d6458-faf7-4ed9-b883-51bfae20d07e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.537568 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.568011 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data" (OuterVolumeSpecName: "config-data") pod "943d6458-faf7-4ed9-b883-51bfae20d07e" (UID: "943d6458-faf7-4ed9-b883-51bfae20d07e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.641303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"943d6458-faf7-4ed9-b883-51bfae20d07e","Type":"ContainerDied","Data":"616fd3bfb3ccc3129f81ec25307ccde2708427e2e3f9640c37f246993d1fb37c"} Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.641362 4743 scope.go:117] "RemoveContainer" containerID="129273effd7281d6673d014609e48e033f4188a95f53d8c08997f95894aefb88" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.641493 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.649477 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d6458-faf7-4ed9-b883-51bfae20d07e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.653760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2a12054f-0a1c-4294-8855-bcb45a1e3684","Type":"ContainerDied","Data":"6df130ee9f012ae80f658429dbc96ea8d85430dc769ff75d2c7d52d7d68af962"} Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.654470 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.709956 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.721747 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.734376 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.751551 4743 scope.go:117] "RemoveContainer" containerID="c2dae9e478ac1ef483d6362aadd56337b8bfa2da59946609f91a556a0a37d94c" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.752008 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.762928 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:28:05 crc kubenswrapper[4743]: E0310 15:28:05.763503 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerName="probe" Mar 10 15:28:05 crc 
kubenswrapper[4743]: I0310 15:28:05.763531 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerName="probe" Mar 10 15:28:05 crc kubenswrapper[4743]: E0310 15:28:05.763577 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerName="cinder-scheduler" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.763588 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerName="cinder-scheduler" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.763977 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerName="cinder-scheduler" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.764005 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" containerName="probe" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.765728 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.772219 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-backup-0"] Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.776403 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.795878 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.796805 4743 scope.go:117] "RemoveContainer" containerID="688cab3c354d213efc05f8ff649dcfe221e23e51d386b88b7335489c46050c48" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.797603 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.801341 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.812384 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.857209 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.857287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40933ebc-541e-4ae6-8280-372d05c43c3c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.857351 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.857390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-scripts\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.857592 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snvxv\" (UniqueName: \"kubernetes.io/projected/40933ebc-541e-4ae6-8280-372d05c43c3c-kube-api-access-snvxv\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.857677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-config-data\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.862918 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.868715 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vmpwv"] Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.868967 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" podUID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" containerName="dnsmasq-dns" containerID="cri-o://e0d52e7ff40eba6b44a945eb6d4ac6091b5525650d674c73187ec2dab5c70c36" gracePeriod=10 Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.916940 4743 scope.go:117] "RemoveContainer" containerID="507a736943caae863597c5bbd129ead8e0aa5bcab55f2caa3adba073be5b44cd" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.959557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 
15:28:05.960426 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40933ebc-541e-4ae6-8280-372d05c43c3c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960606 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960674 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-scripts\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-config-data\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/42310a41-71eb-4eb8-bba6-938d1307270c-ceph\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-dev\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960893 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: 
\"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-lib-modules\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.960991 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-sys\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.961018 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.961054 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.961086 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snvxv\" (UniqueName: \"kubernetes.io/projected/40933ebc-541e-4ae6-8280-372d05c43c3c-kube-api-access-snvxv\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc 
kubenswrapper[4743]: I0310 15:28:05.961185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-run\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.961227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-config-data\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.961259 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.961292 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s76rf\" (UniqueName: \"kubernetes.io/projected/42310a41-71eb-4eb8-bba6-938d1307270c-kube-api-access-s76rf\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.961386 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-scripts\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.973015 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/40933ebc-541e-4ae6-8280-372d05c43c3c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.978939 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-scripts\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.989585 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:05 crc kubenswrapper[4743]: I0310 15:28:05.995278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.005742 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40933ebc-541e-4ae6-8280-372d05c43c3c-config-data\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " pod="openstack/cinder-scheduler-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.030524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snvxv\" (UniqueName: \"kubernetes.io/projected/40933ebc-541e-4ae6-8280-372d05c43c3c-kube-api-access-snvxv\") pod \"cinder-scheduler-0\" (UID: \"40933ebc-541e-4ae6-8280-372d05c43c3c\") " 
pod="openstack/cinder-scheduler-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.033885 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a12054f-0a1c-4294-8855-bcb45a1e3684" path="/var/lib/kubelet/pods/2a12054f-0a1c-4294-8855-bcb45a1e3684/volumes" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.034701 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943d6458-faf7-4ed9-b883-51bfae20d07e" path="/var/lib/kubelet/pods/943d6458-faf7-4ed9-b883-51bfae20d07e/volumes" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/42310a41-71eb-4eb8-bba6-938d1307270c-ceph\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-dev\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065279 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 
15:28:06.065314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-lib-modules\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-sys\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065352 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065374 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-run\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065429 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-machine-id\") pod 
\"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s76rf\" (UniqueName: \"kubernetes.io/projected/42310a41-71eb-4eb8-bba6-938d1307270c-kube-api-access-s76rf\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-scripts\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065504 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065541 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065597 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.065623 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-config-data\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.066467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.066519 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-run\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.066798 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-dev\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.068272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-sys\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.068439 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-lib-modules\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc 
kubenswrapper[4743]: I0310 15:28:06.068530 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.068947 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.069034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.069059 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.069220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/42310a41-71eb-4eb8-bba6-938d1307270c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.072100 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4c7db42-928a-4d49-95df-e2073ad24b21" path="/var/lib/kubelet/pods/d4c7db42-928a-4d49-95df-e2073ad24b21/volumes" Mar 10 
15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.080346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-config-data\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.085491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.088802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-scripts\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.091452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/42310a41-71eb-4eb8-bba6-938d1307270c-ceph\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.111475 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s76rf\" (UniqueName: \"kubernetes.io/projected/42310a41-71eb-4eb8-bba6-938d1307270c-kube-api-access-s76rf\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.111871 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.116091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42310a41-71eb-4eb8-bba6-938d1307270c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"42310a41-71eb-4eb8-bba6-938d1307270c\") " pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.136556 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.298779 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552608-5gj6v" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.383425 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c5df5c-43af-4b40-8a2d-1db9b79a699b-kube-api-access-jhn82" (OuterVolumeSpecName: "kube-api-access-jhn82") pod "e1c5df5c-43af-4b40-8a2d-1db9b79a699b" (UID: "e1c5df5c-43af-4b40-8a2d-1db9b79a699b"). InnerVolumeSpecName "kube-api-access-jhn82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.390672 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhn82\" (UniqueName: \"kubernetes.io/projected/e1c5df5c-43af-4b40-8a2d-1db9b79a699b-kube-api-access-jhn82\") pod \"e1c5df5c-43af-4b40-8a2d-1db9b79a699b\" (UID: \"e1c5df5c-43af-4b40-8a2d-1db9b79a699b\") " Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.392280 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhn82\" (UniqueName: \"kubernetes.io/projected/e1c5df5c-43af-4b40-8a2d-1db9b79a699b-kube-api-access-jhn82\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.459898 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.768232 4743 generic.go:334] "Generic (PLEG): container finished" podID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" containerID="e0d52e7ff40eba6b44a945eb6d4ac6091b5525650d674c73187ec2dab5c70c36" exitCode=0 Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.768707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" event={"ID":"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6","Type":"ContainerDied","Data":"e0d52e7ff40eba6b44a945eb6d4ac6091b5525650d674c73187ec2dab5c70c36"} Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.768750 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" event={"ID":"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6","Type":"ContainerDied","Data":"6c0644b1b9d0286f00e3cbd70e7dcaffd55fa590bdb16cd75ac58d627bd6fa6f"} Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.768763 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0644b1b9d0286f00e3cbd70e7dcaffd55fa590bdb16cd75ac58d627bd6fa6f" Mar 10 15:28:06 crc 
kubenswrapper[4743]: I0310 15:28:06.772073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"df8b86ca-f690-4581-a74f-ab245f3b2479","Type":"ContainerStarted","Data":"a8625ed351a27cff3a41a68fa907fd8f3af4dac0f0e5f0b1c37cac98562a4ab6"} Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.783516 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552608-5gj6v" event={"ID":"e1c5df5c-43af-4b40-8a2d-1db9b79a699b","Type":"ContainerDied","Data":"f35b493de8ae4656439ca459ece634b3b08c732530c135756db99a9f092ec956"} Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.783559 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f35b493de8ae4656439ca459ece634b3b08c732530c135756db99a9f092ec956" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.783632 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552608-5gj6v" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.791850 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.869569 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.912721 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-svc\") pod \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.913293 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-config\") pod \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.913321 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-nb\") pod \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.913398 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqvmm\" (UniqueName: \"kubernetes.io/projected/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-kube-api-access-gqvmm\") pod \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.913421 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-sb\") pod \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " Mar 10 15:28:06 crc 
kubenswrapper[4743]: I0310 15:28:06.913471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-swift-storage-0\") pod \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\" (UID: \"4dc5ff5a-8a62-472d-8bc9-9bae76569ff6\") " Mar 10 15:28:06 crc kubenswrapper[4743]: I0310 15:28:06.938929 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-kube-api-access-gqvmm" (OuterVolumeSpecName: "kube-api-access-gqvmm") pod "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" (UID: "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6"). InnerVolumeSpecName "kube-api-access-gqvmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.016085 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqvmm\" (UniqueName: \"kubernetes.io/projected/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-kube-api-access-gqvmm\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.053744 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" (UID: "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.091323 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-config" (OuterVolumeSpecName: "config") pod "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" (UID: "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.094908 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" (UID: "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.110495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" (UID: "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.115405 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" (UID: "4dc5ff5a-8a62-472d-8bc9-9bae76569ff6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.119309 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.119342 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.119353 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.119364 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.119374 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.308943 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.410875 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552602-msfrq"] Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.435071 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552602-msfrq"] Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.804911 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-volume-volume1-0" event={"ID":"df8b86ca-f690-4581-a74f-ab245f3b2479","Type":"ContainerStarted","Data":"12a9e32c7a9f7a7c5b2cdce35dd3c706867a75f7e44b825b1fe29962abef2d9d"} Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.805300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"df8b86ca-f690-4581-a74f-ab245f3b2479","Type":"ContainerStarted","Data":"3b36faf319868f14f08bc061fae8c9300be246584958e18ca2bfee59ba83e753"} Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.808200 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"40933ebc-541e-4ae6-8280-372d05c43c3c","Type":"ContainerStarted","Data":"889068f1f343979288573c5869d554a6148333e543826d51c0f4eb795da8c5fe"} Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.813914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"42310a41-71eb-4eb8-bba6-938d1307270c","Type":"ContainerStarted","Data":"45c93f988c41adf72f09310dc9da7237eee633806ad85849b99380caac9100c9"} Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.813944 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vmpwv" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.834872 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.834849627 podStartE2EDuration="3.834849627s" podCreationTimestamp="2026-03-10 15:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:07.827601042 +0000 UTC m=+1352.534415790" watchObservedRunningTime="2026-03-10 15:28:07.834849627 +0000 UTC m=+1352.541664375" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.862581 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vmpwv"] Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.870119 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vmpwv"] Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.935671 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" path="/var/lib/kubelet/pods/4dc5ff5a-8a62-472d-8bc9-9bae76569ff6/volumes" Mar 10 15:28:07 crc kubenswrapper[4743]: I0310 15:28:07.936312 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92e45e9-8db6-4fde-aca3-2d0c5024d77e" path="/var/lib/kubelet/pods/b92e45e9-8db6-4fde-aca3-2d0c5024d77e/volumes" Mar 10 15:28:08 crc kubenswrapper[4743]: I0310 15:28:08.850379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"40933ebc-541e-4ae6-8280-372d05c43c3c","Type":"ContainerStarted","Data":"e19395ffed2153efbdfd7cd52942c6fda3efbf7b6fa52a6b660f099dd4a8025a"} Mar 10 15:28:08 crc kubenswrapper[4743]: I0310 15:28:08.870757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"42310a41-71eb-4eb8-bba6-938d1307270c","Type":"ContainerStarted","Data":"c23d881e9d80d668473b74a7d3d591981a1a5193fb5ec2ecaf6a0ded10d67789"} Mar 10 15:28:08 crc kubenswrapper[4743]: I0310 15:28:08.870840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"42310a41-71eb-4eb8-bba6-938d1307270c","Type":"ContainerStarted","Data":"fdf8c12f801f942325bbba057eb58116689406ba12cdaf334dee6dd319046979"} Mar 10 15:28:08 crc kubenswrapper[4743]: I0310 15:28:08.900735 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.900711315 podStartE2EDuration="3.900711315s" podCreationTimestamp="2026-03-10 15:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:08.89383293 +0000 UTC m=+1353.600647678" watchObservedRunningTime="2026-03-10 15:28:08.900711315 +0000 UTC m=+1353.607526073" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.885147 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"40933ebc-541e-4ae6-8280-372d05c43c3c","Type":"ContainerStarted","Data":"0feebf711f8b7f7aec2c1b138d106033833e1fc9e663fe50ca7e13cca6ff5378"} Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.917558 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.917533634 podStartE2EDuration="4.917533634s" podCreationTimestamp="2026-03-10 15:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:09.903172927 +0000 UTC m=+1354.609987685" watchObservedRunningTime="2026-03-10 15:28:09.917533634 +0000 UTC m=+1354.624348382" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.952290 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/openstackclient"] Mar 10 15:28:09 crc kubenswrapper[4743]: E0310 15:28:09.952801 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" containerName="dnsmasq-dns" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.952838 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" containerName="dnsmasq-dns" Mar 10 15:28:09 crc kubenswrapper[4743]: E0310 15:28:09.952864 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" containerName="init" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.952870 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" containerName="init" Mar 10 15:28:09 crc kubenswrapper[4743]: E0310 15:28:09.952886 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c5df5c-43af-4b40-8a2d-1db9b79a699b" containerName="oc" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.952893 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c5df5c-43af-4b40-8a2d-1db9b79a699b" containerName="oc" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.953083 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c5df5c-43af-4b40-8a2d-1db9b79a699b" containerName="oc" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.953101 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc5ff5a-8a62-472d-8bc9-9bae76569ff6" containerName="dnsmasq-dns" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.953806 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.956951 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-p7rjh" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.957213 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.957399 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 15:28:09 crc kubenswrapper[4743]: I0310 15:28:09.966001 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.098252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config-secret\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.098390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b2sg\" (UniqueName: \"kubernetes.io/projected/a14203e7-ab92-4d5b-86af-d253ddd6b215-kube-api-access-5b2sg\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.098473 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.098502 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.203107 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.203160 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.203253 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config-secret\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.203350 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b2sg\" (UniqueName: \"kubernetes.io/projected/a14203e7-ab92-4d5b-86af-d253ddd6b215-kube-api-access-5b2sg\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.204116 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.209258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config-secret\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.210554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.235680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b2sg\" (UniqueName: \"kubernetes.io/projected/a14203e7-ab92-4d5b-86af-d253ddd6b215-kube-api-access-5b2sg\") pod \"openstackclient\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.286534 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.302374 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.339568 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.357036 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.358540 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.369714 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.409187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da9eebbd-7a82-441b-8ca6-14657357a1f0-openstack-config-secret\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.409773 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6vst\" (UniqueName: \"kubernetes.io/projected/da9eebbd-7a82-441b-8ca6-14657357a1f0-kube-api-access-h6vst\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.410246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9eebbd-7a82-441b-8ca6-14657357a1f0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " 
pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.410430 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da9eebbd-7a82-441b-8ca6-14657357a1f0-openstack-config\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.429340 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.512693 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9eebbd-7a82-441b-8ca6-14657357a1f0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.512860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da9eebbd-7a82-441b-8ca6-14657357a1f0-openstack-config\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.512899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da9eebbd-7a82-441b-8ca6-14657357a1f0-openstack-config-secret\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.512919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6vst\" (UniqueName: \"kubernetes.io/projected/da9eebbd-7a82-441b-8ca6-14657357a1f0-kube-api-access-h6vst\") pod 
\"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.513780 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da9eebbd-7a82-441b-8ca6-14657357a1f0-openstack-config\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.521253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da9eebbd-7a82-441b-8ca6-14657357a1f0-openstack-config-secret\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.534004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9eebbd-7a82-441b-8ca6-14657357a1f0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.541491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6vst\" (UniqueName: \"kubernetes.io/projected/da9eebbd-7a82-441b-8ca6-14657357a1f0-kube-api-access-h6vst\") pod \"openstackclient\" (UID: \"da9eebbd-7a82-441b-8ca6-14657357a1f0\") " pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.703439 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.749033 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.831255 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:28:10 crc kubenswrapper[4743]: I0310 15:28:10.831717 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:28:10 crc kubenswrapper[4743]: W0310 15:28:10.996801 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod943d6458_faf7_4ed9_b883_51bfae20d07e.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod943d6458_faf7_4ed9_b883_51bfae20d07e.slice: no such file or directory Mar 10 15:28:10 crc kubenswrapper[4743]: W0310 15:28:10.996892 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb57920_f14f_4e98_bea5_08e3aa17ffb1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb57920_f14f_4e98_bea5_08e3aa17ffb1.slice: no such file or directory Mar 10 15:28:10 crc kubenswrapper[4743]: W0310 15:28:10.996931 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a12054f_0a1c_4294_8855_bcb45a1e3684.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a12054f_0a1c_4294_8855_bcb45a1e3684.slice: no such file or directory Mar 10 15:28:10 crc kubenswrapper[4743]: W0310 15:28:10.996958 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4c7db42_928a_4d49_95df_e2073ad24b21.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4c7db42_928a_4d49_95df_e2073ad24b21.slice: no such file or directory Mar 10 15:28:11 crc kubenswrapper[4743]: I0310 15:28:11.111902 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 15:28:11 crc kubenswrapper[4743]: I0310 15:28:11.137039 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 10 15:28:11 crc kubenswrapper[4743]: E0310 15:28:11.241353 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cff04bc_a0b8_4155_8407_4f8253faa9e3.slice/crio-2eec530f630fcd1bb36338a94574510858c4a559aa4bb1813f6b72cc91a043a3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ee9c60_c790_40a1_816a_4152f87c16e0.slice/crio-conmon-2a095a8acf8c8cc66bb6e09146ffbd5e9facaf71496efad0a082b57c3af62541.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82ee9c60_c790_40a1_816a_4152f87c16e0.slice/crio-2a095a8acf8c8cc66bb6e09146ffbd5e9facaf71496efad0a082b57c3af62541.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e9bcd21_9034_46e7_a0ac_b2da56fa37f9.slice\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f942785_954d_4441_ac29_69e7b65ead94.slice/crio-conmon-a44b0ab62e898d54d43645d8c85eb8aaa2e9faaa6c8a1ebe1d196220ca7d6e80.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68a99e3b_6d76_485c_b284_5f275ba9bbef.slice/crio-8ffbb79c1ba7553c23c09ecc5886e0f9abbe9773800fc7878bbf8fcde3f4c2de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68a99e3b_6d76_485c_b284_5f275ba9bbef.slice/crio-conmon-8ffbb79c1ba7553c23c09ecc5886e0f9abbe9773800fc7878bbf8fcde3f4c2de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f36e37_4d73_48c9_a64b_810a95ed7bac.slice/crio-conmon-38cb9d98a25a82ad345c96bfd981bf7e7b4553c04e203f99a2a717bb6314dd7d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f36e37_4d73_48c9_a64b_810a95ed7bac.slice/crio-conmon-0a0d1f7f3393d7f31400370f146d4299b8f97dde9d368b1123819a2d7c31d935.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f36e37_4d73_48c9_a64b_810a95ed7bac.slice/crio-38cb9d98a25a82ad345c96bfd981bf7e7b4553c04e203f99a2a717bb6314dd7d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68a99e3b_6d76_485c_b284_5f275ba9bbef.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a5acc3b_0431_490e_b3c8_3b2ffa682f8b.slice/crio-68119a62955bb4be3b995ec2294bc6a9da4607d114cadc47e5a26edfc51ddbb3.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d51c7ac_111d_46e8_903f_01f29e4221ac.slice/crio-6776dd2441f62d1ab751bcf7bc9237313b93fa8dd5929cf0e5f7477e3ae97d04.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f36e37_4d73_48c9_a64b_810a95ed7bac.slice/crio-0a0d1f7f3393d7f31400370f146d4299b8f97dde9d368b1123819a2d7c31d935.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68a99e3b_6d76_485c_b284_5f275ba9bbef.slice/crio-cef2e7c147540f7b19254f5416cdd113250b631982a8a3fab85e9d38670a1a08\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a5acc3b_0431_490e_b3c8_3b2ffa682f8b.slice/crio-conmon-3690d14ef39973676eb199ac0c743ebaacea85a3090a411e71de0ba1a770ca3e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a5acc3b_0431_490e_b3c8_3b2ffa682f8b.slice/crio-conmon-68119a62955bb4be3b995ec2294bc6a9da4607d114cadc47e5a26edfc51ddbb3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cff04bc_a0b8_4155_8407_4f8253faa9e3.slice/crio-conmon-2eec530f630fcd1bb36338a94574510858c4a559aa4bb1813f6b72cc91a043a3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e9bcd21_9034_46e7_a0ac_b2da56fa37f9.slice/crio-adf4a2d964ccc56ef771f6302a0f67e471f6a714962f11a3f9d70a8aa5217518\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a5acc3b_0431_490e_b3c8_3b2ffa682f8b.slice/crio-3690d14ef39973676eb199ac0c743ebaacea85a3090a411e71de0ba1a770ca3e.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d51c7ac_111d_46e8_903f_01f29e4221ac.slice/crio-conmon-6776dd2441f62d1ab751bcf7bc9237313b93fa8dd5929cf0e5f7477e3ae97d04.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:28:11 crc kubenswrapper[4743]: I0310 15:28:11.921066 4743 generic.go:334] "Generic (PLEG): container finished" podID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerID="ce60b74ef61fbf223c1bc21dab3d5d62cd5f5056838b8d940cf9e51fcdd94d94" exitCode=137 Mar 10 15:28:11 crc kubenswrapper[4743]: I0310 15:28:11.927165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" event={"ID":"0cff04bc-a0b8-4155-8407-4f8253faa9e3","Type":"ContainerDied","Data":"ce60b74ef61fbf223c1bc21dab3d5d62cd5f5056838b8d940cf9e51fcdd94d94"} Mar 10 15:28:11 crc kubenswrapper[4743]: I0310 15:28:11.927828 4743 generic.go:334] "Generic (PLEG): container finished" podID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerID="8b872ebecc6a2a29bc444153e8f498acea78073dee93b1d2f2eeb719777ad36b" exitCode=137 Mar 10 15:28:11 crc kubenswrapper[4743]: I0310 15:28:11.928674 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fd79b999-7w7qq" event={"ID":"82ee9c60-c790-40a1-816a-4152f87c16e0","Type":"ContainerDied","Data":"8b872ebecc6a2a29bc444153e8f498acea78073dee93b1d2f2eeb719777ad36b"} Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.271855 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7457b8c8b7-rzp6b"] Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.273795 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.283610 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.283771 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.283923 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.294227 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7457b8c8b7-rzp6b"] Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.304034 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-config-data\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.304101 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-run-httpd\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.304182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-internal-tls-certs\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 
15:28:14.304210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-etc-swift\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.304256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8szbn\" (UniqueName: \"kubernetes.io/projected/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-kube-api-access-8szbn\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.304289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-log-httpd\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.304318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-combined-ca-bundle\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.304359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-public-tls-certs\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 
15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.409514 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-internal-tls-certs\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.409575 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-etc-swift\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.409614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8szbn\" (UniqueName: \"kubernetes.io/projected/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-kube-api-access-8szbn\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.409641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-log-httpd\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.409665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-combined-ca-bundle\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 
15:28:14.409697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-public-tls-certs\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.409771 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-config-data\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.409804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-run-httpd\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.410400 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-run-httpd\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.416689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-etc-swift\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.417105 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-public-tls-certs\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.421470 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-log-httpd\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.424427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-internal-tls-certs\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.424674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-combined-ca-bundle\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.425873 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-config-data\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.439572 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8szbn\" (UniqueName: 
\"kubernetes.io/projected/5d2c4fc9-7b3d-457e-af7d-52e1cda83b53-kube-api-access-8szbn\") pod \"swift-proxy-7457b8c8b7-rzp6b\" (UID: \"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53\") " pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.451554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:14 crc kubenswrapper[4743]: E0310 15:28:14.606004 4743 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:28:14 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_a14203e7-ab92-4d5b-86af-d253ddd6b215_0(c7e8295bba83620eaaccbcf0f020c19b74dd9914abb400b7b816e428ff70ebbd): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c7e8295bba83620eaaccbcf0f020c19b74dd9914abb400b7b816e428ff70ebbd" Netns:"/var/run/netns/a3f91e28-1695-40bf-8a11-b328965a761a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=c7e8295bba83620eaaccbcf0f020c19b74dd9914abb400b7b816e428ff70ebbd;K8S_POD_UID=a14203e7-ab92-4d5b-86af-d253ddd6b215" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/a14203e7-ab92-4d5b-86af-d253ddd6b215]: expected pod UID "a14203e7-ab92-4d5b-86af-d253ddd6b215" but got "da9eebbd-7a82-441b-8ca6-14657357a1f0" from Kube API Mar 10 15:28:14 crc kubenswrapper[4743]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:28:14 crc kubenswrapper[4743]: > Mar 10 15:28:14 crc kubenswrapper[4743]: E0310 15:28:14.606263 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:28:14 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_a14203e7-ab92-4d5b-86af-d253ddd6b215_0(c7e8295bba83620eaaccbcf0f020c19b74dd9914abb400b7b816e428ff70ebbd): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c7e8295bba83620eaaccbcf0f020c19b74dd9914abb400b7b816e428ff70ebbd" Netns:"/var/run/netns/a3f91e28-1695-40bf-8a11-b328965a761a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=c7e8295bba83620eaaccbcf0f020c19b74dd9914abb400b7b816e428ff70ebbd;K8S_POD_UID=a14203e7-ab92-4d5b-86af-d253ddd6b215" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/a14203e7-ab92-4d5b-86af-d253ddd6b215]: expected pod UID "a14203e7-ab92-4d5b-86af-d253ddd6b215" but got "da9eebbd-7a82-441b-8ca6-14657357a1f0" from Kube API Mar 10 15:28:14 crc kubenswrapper[4743]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:28:14 crc kubenswrapper[4743]: > pod="openstack/openstackclient" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.661383 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.715937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m8bf\" (UniqueName: \"kubernetes.io/projected/0cff04bc-a0b8-4155-8407-4f8253faa9e3-kube-api-access-5m8bf\") pod \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.716089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data\") pod \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.716127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-combined-ca-bundle\") pod \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.716247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data-custom\") pod 
\"0cff04bc-a0b8-4155-8407-4f8253faa9e3\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.716281 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cff04bc-a0b8-4155-8407-4f8253faa9e3-logs\") pod \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\" (UID: \"0cff04bc-a0b8-4155-8407-4f8253faa9e3\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.717608 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cff04bc-a0b8-4155-8407-4f8253faa9e3-logs" (OuterVolumeSpecName: "logs") pod "0cff04bc-a0b8-4155-8407-4f8253faa9e3" (UID: "0cff04bc-a0b8-4155-8407-4f8253faa9e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.729050 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cff04bc-a0b8-4155-8407-4f8253faa9e3-kube-api-access-5m8bf" (OuterVolumeSpecName: "kube-api-access-5m8bf") pod "0cff04bc-a0b8-4155-8407-4f8253faa9e3" (UID: "0cff04bc-a0b8-4155-8407-4f8253faa9e3"). InnerVolumeSpecName "kube-api-access-5m8bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.731965 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0cff04bc-a0b8-4155-8407-4f8253faa9e3" (UID: "0cff04bc-a0b8-4155-8407-4f8253faa9e3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.754947 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.778961 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cff04bc-a0b8-4155-8407-4f8253faa9e3" (UID: "0cff04bc-a0b8-4155-8407-4f8253faa9e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.819383 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data\") pod \"82ee9c60-c790-40a1-816a-4152f87c16e0\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.819530 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82ee9c60-c790-40a1-816a-4152f87c16e0-logs\") pod \"82ee9c60-c790-40a1-816a-4152f87c16e0\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.819555 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data-custom\") pod \"82ee9c60-c790-40a1-816a-4152f87c16e0\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.819633 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sww8\" (UniqueName: \"kubernetes.io/projected/82ee9c60-c790-40a1-816a-4152f87c16e0-kube-api-access-2sww8\") pod \"82ee9c60-c790-40a1-816a-4152f87c16e0\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.819746 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-combined-ca-bundle\") pod \"82ee9c60-c790-40a1-816a-4152f87c16e0\" (UID: \"82ee9c60-c790-40a1-816a-4152f87c16e0\") " Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.820221 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m8bf\" (UniqueName: \"kubernetes.io/projected/0cff04bc-a0b8-4155-8407-4f8253faa9e3-kube-api-access-5m8bf\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.820240 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.820251 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.820261 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cff04bc-a0b8-4155-8407-4f8253faa9e3-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.820980 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82ee9c60-c790-40a1-816a-4152f87c16e0-logs" (OuterVolumeSpecName: "logs") pod "82ee9c60-c790-40a1-816a-4152f87c16e0" (UID: "82ee9c60-c790-40a1-816a-4152f87c16e0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.845434 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ee9c60-c790-40a1-816a-4152f87c16e0-kube-api-access-2sww8" (OuterVolumeSpecName: "kube-api-access-2sww8") pod "82ee9c60-c790-40a1-816a-4152f87c16e0" (UID: "82ee9c60-c790-40a1-816a-4152f87c16e0"). InnerVolumeSpecName "kube-api-access-2sww8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.849262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "82ee9c60-c790-40a1-816a-4152f87c16e0" (UID: "82ee9c60-c790-40a1-816a-4152f87c16e0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.855123 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56f8646897-kmnvw" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.874806 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82ee9c60-c790-40a1-816a-4152f87c16e0" (UID: "82ee9c60-c790-40a1-816a-4152f87c16e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.894852 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data" (OuterVolumeSpecName: "config-data") pod "0cff04bc-a0b8-4155-8407-4f8253faa9e3" (UID: "0cff04bc-a0b8-4155-8407-4f8253faa9e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.918161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data" (OuterVolumeSpecName: "config-data") pod "82ee9c60-c790-40a1-816a-4152f87c16e0" (UID: "82ee9c60-c790-40a1-816a-4152f87c16e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.923272 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.923310 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82ee9c60-c790-40a1-816a-4152f87c16e0-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.923320 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.923333 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cff04bc-a0b8-4155-8407-4f8253faa9e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.923344 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sww8\" (UniqueName: \"kubernetes.io/projected/82ee9c60-c790-40a1-816a-4152f87c16e0-kube-api-access-2sww8\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.923354 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82ee9c60-c790-40a1-816a-4152f87c16e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.962271 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75f4f5966d-fg8q8"] Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.962560 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75f4f5966d-fg8q8" podUID="ea014130-54ae-451e-a870-aacb43d98f25" containerName="neutron-api" containerID="cri-o://d9aa7fa9d05f03db36a66470ebb238e22cabd8e8e11aaace35086635c13ed054" gracePeriod=30 Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.963326 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75f4f5966d-fg8q8" podUID="ea014130-54ae-451e-a870-aacb43d98f25" containerName="neutron-httpd" containerID="cri-o://192e509108abac65efb1d060ec5da09ac6bf2c544df24eb04d4592ef166a835a" gracePeriod=30 Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.984279 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fd79b999-7w7qq" event={"ID":"82ee9c60-c790-40a1-816a-4152f87c16e0","Type":"ContainerDied","Data":"1699856c3ee2daa565d2439c06b44178988b10e25e4b421a21a662364641fada"} Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.984417 4743 scope.go:117] "RemoveContainer" containerID="8b872ebecc6a2a29bc444153e8f498acea78073dee93b1d2f2eeb719777ad36b" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.984363 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6fd79b999-7w7qq" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.995471 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.995501 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" Mar 10 15:28:14 crc kubenswrapper[4743]: I0310 15:28:14.995548 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c7d566dd6-n2r2w" event={"ID":"0cff04bc-a0b8-4155-8407-4f8253faa9e3","Type":"ContainerDied","Data":"a2523c50d60a3e00a86ffee03ac7e3ab470e861d5fe67bca9f602be42708e8df"} Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.023107 4743 scope.go:117] "RemoveContainer" containerID="2a095a8acf8c8cc66bb6e09146ffbd5e9facaf71496efad0a082b57c3af62541" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.029507 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.035337 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a14203e7-ab92-4d5b-86af-d253ddd6b215" podUID="da9eebbd-7a82-441b-8ca6-14657357a1f0" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.064879 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6fd79b999-7w7qq"] Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.074166 4743 scope.go:117] "RemoveContainer" containerID="ce60b74ef61fbf223c1bc21dab3d5d62cd5f5056838b8d940cf9e51fcdd94d94" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.117034 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6fd79b999-7w7qq"] Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.125466 4743 scope.go:117] "RemoveContainer" containerID="2eec530f630fcd1bb36338a94574510858c4a559aa4bb1813f6b72cc91a043a3" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.126286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config\") pod 
\"a14203e7-ab92-4d5b-86af-d253ddd6b215\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.126349 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config-secret\") pod \"a14203e7-ab92-4d5b-86af-d253ddd6b215\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.126424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-combined-ca-bundle\") pod \"a14203e7-ab92-4d5b-86af-d253ddd6b215\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.126507 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b2sg\" (UniqueName: \"kubernetes.io/projected/a14203e7-ab92-4d5b-86af-d253ddd6b215-kube-api-access-5b2sg\") pod \"a14203e7-ab92-4d5b-86af-d253ddd6b215\" (UID: \"a14203e7-ab92-4d5b-86af-d253ddd6b215\") " Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.128128 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a14203e7-ab92-4d5b-86af-d253ddd6b215" (UID: "a14203e7-ab92-4d5b-86af-d253ddd6b215"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.133963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a14203e7-ab92-4d5b-86af-d253ddd6b215" (UID: "a14203e7-ab92-4d5b-86af-d253ddd6b215"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.155204 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a14203e7-ab92-4d5b-86af-d253ddd6b215" (UID: "a14203e7-ab92-4d5b-86af-d253ddd6b215"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.160723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14203e7-ab92-4d5b-86af-d253ddd6b215-kube-api-access-5b2sg" (OuterVolumeSpecName: "kube-api-access-5b2sg") pod "a14203e7-ab92-4d5b-86af-d253ddd6b215" (UID: "a14203e7-ab92-4d5b-86af-d253ddd6b215"). InnerVolumeSpecName "kube-api-access-5b2sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.176094 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6c7d566dd6-n2r2w"] Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.197085 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6c7d566dd6-n2r2w"] Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.229110 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.229503 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.229521 
4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14203e7-ab92-4d5b-86af-d253ddd6b215-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.229533 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b2sg\" (UniqueName: \"kubernetes.io/projected/a14203e7-ab92-4d5b-86af-d253ddd6b215-kube-api-access-5b2sg\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.432516 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.525980 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7457b8c8b7-rzp6b"] Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.771679 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.902244 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.902585 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="ceilometer-central-agent" containerID="cri-o://230b1bc273437d1877bdcf2e4a915dd66e45a9a78f449e76179bc080091e4000" gracePeriod=30 Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.903440 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="sg-core" containerID="cri-o://5d8c01218feae892cd3966a52852ea056dc65d8845fda0bf23283cc668ceeee7" gracePeriod=30 Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.903457 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="ceilometer-notification-agent" containerID="cri-o://67c2317d09f90aa678ba4358c6d39f9ed608fd6a422c895fb4a0d598a2b0ff10" gracePeriod=30 Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.903457 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="proxy-httpd" containerID="cri-o://650e5574e9bb87746a05a9434a5878c5d6bd44c0f3bb2fcd74afc13f777dcded" gracePeriod=30 Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.950533 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" path="/var/lib/kubelet/pods/0cff04bc-a0b8-4155-8407-4f8253faa9e3/volumes" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.956259 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" path="/var/lib/kubelet/pods/82ee9c60-c790-40a1-816a-4152f87c16e0/volumes" Mar 10 15:28:15 crc kubenswrapper[4743]: I0310 15:28:15.957611 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14203e7-ab92-4d5b-86af-d253ddd6b215" path="/var/lib/kubelet/pods/a14203e7-ab92-4d5b-86af-d253ddd6b215/volumes" Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.014770 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.180:3000/\": read tcp 10.217.0.2:54346->10.217.0.180:3000: read: connection reset by peer" Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.066268 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" event={"ID":"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53","Type":"ContainerStarted","Data":"866d48f832666dfe1a13dd03d396392f6a0e826b46c44400253d734771b007cc"} Mar 10 15:28:16 crc 
kubenswrapper[4743]: I0310 15:28:16.066341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" event={"ID":"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53","Type":"ContainerStarted","Data":"4bc3bf98ad04490378fc85334a29c36e6d875dce947e4e0f06b1600f4b127eb7"} Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.116431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da9eebbd-7a82-441b-8ca6-14657357a1f0","Type":"ContainerStarted","Data":"be6da27c8a7844ae5c0051b36122fa4ff6c5fb065a2c3801235ec2c8c83e441a"} Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.130378 4743 generic.go:334] "Generic (PLEG): container finished" podID="ea014130-54ae-451e-a870-aacb43d98f25" containerID="192e509108abac65efb1d060ec5da09ac6bf2c544df24eb04d4592ef166a835a" exitCode=0 Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.130477 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f4f5966d-fg8q8" event={"ID":"ea014130-54ae-451e-a870-aacb43d98f25","Type":"ContainerDied","Data":"192e509108abac65efb1d060ec5da09ac6bf2c544df24eb04d4592ef166a835a"} Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.142131 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.143606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"4b909bf2-989f-40bd-87ce-ccd96ec9d39e","Type":"ContainerStarted","Data":"a64e965a332953984abf635ffef38ca101116725561da9efe40af8eda4b37a1a"} Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.143646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"4b909bf2-989f-40bd-87ce-ccd96ec9d39e","Type":"ContainerStarted","Data":"cdb32eba8d1a4268f99e0845311ce0a7ee5d539732236cc6b46d5b0a69325b32"} Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.172306 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a14203e7-ab92-4d5b-86af-d253ddd6b215" podUID="da9eebbd-7a82-441b-8ca6-14657357a1f0" Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.183570 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.438755261 podStartE2EDuration="21.18355212s" podCreationTimestamp="2026-03-10 15:27:55 +0000 UTC" firstStartedPulling="2026-03-10 15:27:56.558028194 +0000 UTC m=+1341.264842942" lastFinishedPulling="2026-03-10 15:28:14.302825053 +0000 UTC m=+1359.009639801" observedRunningTime="2026-03-10 15:28:16.167679 +0000 UTC m=+1360.874493748" watchObservedRunningTime="2026-03-10 15:28:16.18355212 +0000 UTC m=+1360.890366868" Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.199921 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a14203e7-ab92-4d5b-86af-d253ddd6b215" podUID="da9eebbd-7a82-441b-8ca6-14657357a1f0" Mar 10 15:28:16 crc kubenswrapper[4743]: E0310 15:28:16.240048 4743 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat 
"/var/lib/containers/storage/overlay/41200a6fdec7aa491d16aeabb4b3c0332f75c725bb6ace8d2adbf974dae2afb4/diff" to get inode usage: stat /var/lib/containers/storage/overlay/41200a6fdec7aa491d16aeabb4b3c0332f75c725bb6ace8d2adbf974dae2afb4/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ceilometer-0_482b3103-f6d6-410f-9106-b10ad1695c78/ceilometer-notification-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_482b3103-f6d6-410f-9106-b10ad1695c78/ceilometer-notification-agent/0.log: no such file or directory Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.692508 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 10 15:28:16 crc kubenswrapper[4743]: I0310 15:28:16.914763 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.162025 4743 generic.go:334] "Generic (PLEG): container finished" podID="ea014130-54ae-451e-a870-aacb43d98f25" containerID="d9aa7fa9d05f03db36a66470ebb238e22cabd8e8e11aaace35086635c13ed054" exitCode=0 Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.162122 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f4f5966d-fg8q8" event={"ID":"ea014130-54ae-451e-a870-aacb43d98f25","Type":"ContainerDied","Data":"d9aa7fa9d05f03db36a66470ebb238e22cabd8e8e11aaace35086635c13ed054"} Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.187500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" event={"ID":"5d2c4fc9-7b3d-457e-af7d-52e1cda83b53","Type":"ContainerStarted","Data":"c5d6fd60e49d550a8544717a28441d28feecec907955b0433392c7d4cae54ea1"} Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.188657 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:17 crc 
kubenswrapper[4743]: I0310 15:28:17.188709 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.201908 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerID="650e5574e9bb87746a05a9434a5878c5d6bd44c0f3bb2fcd74afc13f777dcded" exitCode=0 Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.201960 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerID="5d8c01218feae892cd3966a52852ea056dc65d8845fda0bf23283cc668ceeee7" exitCode=2 Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.201973 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerID="67c2317d09f90aa678ba4358c6d39f9ed608fd6a422c895fb4a0d598a2b0ff10" exitCode=0 Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.201981 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerID="230b1bc273437d1877bdcf2e4a915dd66e45a9a78f449e76179bc080091e4000" exitCode=0 Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.202039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerDied","Data":"650e5574e9bb87746a05a9434a5878c5d6bd44c0f3bb2fcd74afc13f777dcded"} Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.202100 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerDied","Data":"5d8c01218feae892cd3966a52852ea056dc65d8845fda0bf23283cc668ceeee7"} Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.202113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerDied","Data":"67c2317d09f90aa678ba4358c6d39f9ed608fd6a422c895fb4a0d598a2b0ff10"} Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.202123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerDied","Data":"230b1bc273437d1877bdcf2e4a915dd66e45a9a78f449e76179bc080091e4000"} Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.232923 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" podStartSLOduration=3.23290072 podStartE2EDuration="3.23290072s" podCreationTimestamp="2026-03-10 15:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:17.227019773 +0000 UTC m=+1361.933834521" watchObservedRunningTime="2026-03-10 15:28:17.23290072 +0000 UTC m=+1361.939715468" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.373111 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.386283 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424158 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-run-httpd\") pod \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424224 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jbt7\" (UniqueName: \"kubernetes.io/projected/ae9e10bd-56f9-4223-a2ea-9eadfe923042-kube-api-access-8jbt7\") pod \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424270 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-ovndb-tls-certs\") pod \"ea014130-54ae-451e-a870-aacb43d98f25\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424369 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-scripts\") pod \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-log-httpd\") pod \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424453 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-combined-ca-bundle\") pod \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424548 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae9e10bd-56f9-4223-a2ea-9eadfe923042" (UID: "ae9e10bd-56f9-4223-a2ea-9eadfe923042"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424662 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-sg-core-conf-yaml\") pod \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dgl8\" (UniqueName: \"kubernetes.io/projected/ea014130-54ae-451e-a870-aacb43d98f25-kube-api-access-5dgl8\") pod \"ea014130-54ae-451e-a870-aacb43d98f25\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424766 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-config\") pod \"ea014130-54ae-451e-a870-aacb43d98f25\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424842 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-combined-ca-bundle\") pod \"ea014130-54ae-451e-a870-aacb43d98f25\" (UID: 
\"ea014130-54ae-451e-a870-aacb43d98f25\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424865 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-httpd-config\") pod \"ea014130-54ae-451e-a870-aacb43d98f25\" (UID: \"ea014130-54ae-451e-a870-aacb43d98f25\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424854 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae9e10bd-56f9-4223-a2ea-9eadfe923042" (UID: "ae9e10bd-56f9-4223-a2ea-9eadfe923042"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.424890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-config-data\") pod \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\" (UID: \"ae9e10bd-56f9-4223-a2ea-9eadfe923042\") " Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.433348 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea014130-54ae-451e-a870-aacb43d98f25-kube-api-access-5dgl8" (OuterVolumeSpecName: "kube-api-access-5dgl8") pod "ea014130-54ae-451e-a870-aacb43d98f25" (UID: "ea014130-54ae-451e-a870-aacb43d98f25"). InnerVolumeSpecName "kube-api-access-5dgl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.434723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9e10bd-56f9-4223-a2ea-9eadfe923042-kube-api-access-8jbt7" (OuterVolumeSpecName: "kube-api-access-8jbt7") pod "ae9e10bd-56f9-4223-a2ea-9eadfe923042" (UID: "ae9e10bd-56f9-4223-a2ea-9eadfe923042"). 
InnerVolumeSpecName "kube-api-access-8jbt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.444649 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.444686 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dgl8\" (UniqueName: \"kubernetes.io/projected/ea014130-54ae-451e-a870-aacb43d98f25-kube-api-access-5dgl8\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.444702 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae9e10bd-56f9-4223-a2ea-9eadfe923042-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.444712 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jbt7\" (UniqueName: \"kubernetes.io/projected/ae9e10bd-56f9-4223-a2ea-9eadfe923042-kube-api-access-8jbt7\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.447185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ea014130-54ae-451e-a870-aacb43d98f25" (UID: "ea014130-54ae-451e-a870-aacb43d98f25"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.447303 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-scripts" (OuterVolumeSpecName: "scripts") pod "ae9e10bd-56f9-4223-a2ea-9eadfe923042" (UID: "ae9e10bd-56f9-4223-a2ea-9eadfe923042"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.495501 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae9e10bd-56f9-4223-a2ea-9eadfe923042" (UID: "ae9e10bd-56f9-4223-a2ea-9eadfe923042"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.525963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-config" (OuterVolumeSpecName: "config") pod "ea014130-54ae-451e-a870-aacb43d98f25" (UID: "ea014130-54ae-451e-a870-aacb43d98f25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.547032 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.547065 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.547074 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.547083 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 
15:28:17.556937 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea014130-54ae-451e-a870-aacb43d98f25" (UID: "ea014130-54ae-451e-a870-aacb43d98f25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.570969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae9e10bd-56f9-4223-a2ea-9eadfe923042" (UID: "ae9e10bd-56f9-4223-a2ea-9eadfe923042"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.586861 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ea014130-54ae-451e-a870-aacb43d98f25" (UID: "ea014130-54ae-451e-a870-aacb43d98f25"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.621508 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-config-data" (OuterVolumeSpecName: "config-data") pod "ae9e10bd-56f9-4223-a2ea-9eadfe923042" (UID: "ae9e10bd-56f9-4223-a2ea-9eadfe923042"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.649530 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.649577 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.649593 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9e10bd-56f9-4223-a2ea-9eadfe923042-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:17 crc kubenswrapper[4743]: I0310 15:28:17.649606 4743 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea014130-54ae-451e-a870-aacb43d98f25-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.218112 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75f4f5966d-fg8q8" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.218151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f4f5966d-fg8q8" event={"ID":"ea014130-54ae-451e-a870-aacb43d98f25","Type":"ContainerDied","Data":"c2848a8283f1c8452c92e1cbd25219a8af27aead1be8d49904aeba086de09651"} Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.218204 4743 scope.go:117] "RemoveContainer" containerID="192e509108abac65efb1d060ec5da09ac6bf2c544df24eb04d4592ef166a835a" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.228654 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae9e10bd-56f9-4223-a2ea-9eadfe923042","Type":"ContainerDied","Data":"872f8243a2ec3bdae89d1e1f5a5ec93b22661aaa5301012aa6e2153d11ec5869"} Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.228833 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.287905 4743 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/3c64a9e90b74997a0dcbb6b4aba6979e2613eef10d518ae221cf72bdcc2ccf12/diff" to get inode usage: stat /var/lib/containers/storage/overlay/3c64a9e90b74997a0dcbb6b4aba6979e2613eef10d518ae221cf72bdcc2ccf12/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_neutron-75f4f5966d-fg8q8_ea014130-54ae-451e-a870-aacb43d98f25/neutron-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_neutron-75f4f5966d-fg8q8_ea014130-54ae-451e-a870-aacb43d98f25/neutron-httpd/0.log: no such file or directory Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.290707 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75f4f5966d-fg8q8"] Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.302743 4743 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-75f4f5966d-fg8q8"] Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.314337 4743 scope.go:117] "RemoveContainer" containerID="d9aa7fa9d05f03db36a66470ebb238e22cabd8e8e11aaace35086635c13ed054" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.328500 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.344915 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.362443 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.362939 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="ceilometer-notification-agent" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.362953 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="ceilometer-notification-agent" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.362968 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerName="barbican-keystone-listener-log" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.362976 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerName="barbican-keystone-listener-log" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.362995 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="ceilometer-central-agent" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363005 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="ceilometer-central-agent" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 
15:28:18.363018 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerName="barbican-worker-log" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363025 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerName="barbican-worker-log" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.363037 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea014130-54ae-451e-a870-aacb43d98f25" containerName="neutron-httpd" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363044 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea014130-54ae-451e-a870-aacb43d98f25" containerName="neutron-httpd" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.363054 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="sg-core" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363060 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="sg-core" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.363071 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerName="barbican-worker" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363076 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerName="barbican-worker" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.363088 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="proxy-httpd" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363093 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="proxy-httpd" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.363108 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea014130-54ae-451e-a870-aacb43d98f25" containerName="neutron-api" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363116 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea014130-54ae-451e-a870-aacb43d98f25" containerName="neutron-api" Mar 10 15:28:18 crc kubenswrapper[4743]: E0310 15:28:18.363131 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerName="barbican-keystone-listener" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363136 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerName="barbican-keystone-listener" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363333 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerName="barbican-worker-log" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363349 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ee9c60-c790-40a1-816a-4152f87c16e0" containerName="barbican-worker" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363366 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerName="barbican-keystone-listener-log" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363378 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea014130-54ae-451e-a870-aacb43d98f25" containerName="neutron-api" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363389 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cff04bc-a0b8-4155-8407-4f8253faa9e3" containerName="barbican-keystone-listener" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363400 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="ceilometer-notification-agent" 
Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363431 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="ceilometer-central-agent" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363442 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="proxy-httpd" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363452 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" containerName="sg-core" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363462 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea014130-54ae-451e-a870-aacb43d98f25" containerName="neutron-httpd" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.363467 4743 scope.go:117] "RemoveContainer" containerID="650e5574e9bb87746a05a9434a5878c5d6bd44c0f3bb2fcd74afc13f777dcded" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.365187 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.372756 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.373081 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.374926 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.457553 4743 scope.go:117] "RemoveContainer" containerID="5d8c01218feae892cd3966a52852ea056dc65d8845fda0bf23283cc668ceeee7" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.473074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgpsh\" (UniqueName: \"kubernetes.io/projected/0da7188b-76b5-4b5c-9b73-061bc70b044d-kube-api-access-zgpsh\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.473163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-scripts\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.473215 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-run-httpd\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.473231 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.473265 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-log-httpd\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.473297 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.473341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-config-data\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.531541 4743 scope.go:117] "RemoveContainer" containerID="67c2317d09f90aa678ba4358c6d39f9ed608fd6a422c895fb4a0d598a2b0ff10" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.562375 4743 scope.go:117] "RemoveContainer" containerID="230b1bc273437d1877bdcf2e4a915dd66e45a9a78f449e76179bc080091e4000" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.579450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-scripts\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " 
pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.579769 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-run-httpd\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.580132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.580199 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-log-httpd\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.580228 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-run-httpd\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.580252 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.581112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-log-httpd\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.581244 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-config-data\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.581446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgpsh\" (UniqueName: \"kubernetes.io/projected/0da7188b-76b5-4b5c-9b73-061bc70b044d-kube-api-access-zgpsh\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.584203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.586189 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-config-data\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.588464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 
15:28:18.601964 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-scripts\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.612520 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgpsh\" (UniqueName: \"kubernetes.io/projected/0da7188b-76b5-4b5c-9b73-061bc70b044d-kube-api-access-zgpsh\") pod \"ceilometer-0\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " pod="openstack/ceilometer-0" Mar 10 15:28:18 crc kubenswrapper[4743]: I0310 15:28:18.756040 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:19 crc kubenswrapper[4743]: I0310 15:28:19.297880 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:19 crc kubenswrapper[4743]: I0310 15:28:19.464672 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 10 15:28:19 crc kubenswrapper[4743]: I0310 15:28:19.510647 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 15:28:19 crc kubenswrapper[4743]: I0310 15:28:19.934383 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae9e10bd-56f9-4223-a2ea-9eadfe923042" path="/var/lib/kubelet/pods/ae9e10bd-56f9-4223-a2ea-9eadfe923042/volumes" Mar 10 15:28:19 crc kubenswrapper[4743]: I0310 15:28:19.936029 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea014130-54ae-451e-a870-aacb43d98f25" path="/var/lib/kubelet/pods/ea014130-54ae-451e-a870-aacb43d98f25/volumes" Mar 10 15:28:20 crc kubenswrapper[4743]: I0310 15:28:20.234461 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:20 crc kubenswrapper[4743]: I0310 
15:28:20.265138 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerName="manila-scheduler" containerID="cri-o://71096faa0e0eeec1fe27cbe9fdbb571288c7b89b0a890a8aa52a127d11c51f69" gracePeriod=30 Mar 10 15:28:20 crc kubenswrapper[4743]: I0310 15:28:20.265386 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerStarted","Data":"6010b19f4be05d160cf049a7d05bf3cbadafd0e2a58035676a5bee21950c3aec"} Mar 10 15:28:20 crc kubenswrapper[4743]: I0310 15:28:20.265418 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerStarted","Data":"99d967202aade8dbc5b45c3419595d8f5621f4fa44ab8bb34e844be344e7c0a2"} Mar 10 15:28:20 crc kubenswrapper[4743]: I0310 15:28:20.265684 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerName="probe" containerID="cri-o://439d72982dac18daacc3f0d4f42611ef8a18e0f95664b46c08015c1d0ca4f536" gracePeriod=30 Mar 10 15:28:20 crc kubenswrapper[4743]: I0310 15:28:20.747949 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 10 15:28:20 crc kubenswrapper[4743]: I0310 15:28:20.855104 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-958fd895b-mxn2t" podUID="cccf05c8-d4e8-4a1d-912f-5f4a37440ac7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: 
connect: connection refused" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.277074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerStarted","Data":"7855c5beceb4ee5a4847552ceae24450b14a4577211aa818bd718a1b1bc4f6a1"} Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.280394 4743 generic.go:334] "Generic (PLEG): container finished" podID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerID="439d72982dac18daacc3f0d4f42611ef8a18e0f95664b46c08015c1d0ca4f536" exitCode=0 Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.280427 4743 generic.go:334] "Generic (PLEG): container finished" podID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerID="71096faa0e0eeec1fe27cbe9fdbb571288c7b89b0a890a8aa52a127d11c51f69" exitCode=0 Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.280451 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"98c2fbbb-1c4c-421c-9d27-fae1884c9b54","Type":"ContainerDied","Data":"439d72982dac18daacc3f0d4f42611ef8a18e0f95664b46c08015c1d0ca4f536"} Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.280479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"98c2fbbb-1c4c-421c-9d27-fae1884c9b54","Type":"ContainerDied","Data":"71096faa0e0eeec1fe27cbe9fdbb571288c7b89b0a890a8aa52a127d11c51f69"} Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.364616 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.465580 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-etc-machine-id\") pod \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.465679 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data\") pod \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.465700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data-custom\") pod \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.465829 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-scripts\") pod \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.465872 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8wrx\" (UniqueName: \"kubernetes.io/projected/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-kube-api-access-q8wrx\") pod \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.465888 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-combined-ca-bundle\") pod \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\" (UID: \"98c2fbbb-1c4c-421c-9d27-fae1884c9b54\") " Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.482189 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "98c2fbbb-1c4c-421c-9d27-fae1884c9b54" (UID: "98c2fbbb-1c4c-421c-9d27-fae1884c9b54"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.495056 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-scripts" (OuterVolumeSpecName: "scripts") pod "98c2fbbb-1c4c-421c-9d27-fae1884c9b54" (UID: "98c2fbbb-1c4c-421c-9d27-fae1884c9b54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.495234 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "98c2fbbb-1c4c-421c-9d27-fae1884c9b54" (UID: "98c2fbbb-1c4c-421c-9d27-fae1884c9b54"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.503149 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-kube-api-access-q8wrx" (OuterVolumeSpecName: "kube-api-access-q8wrx") pod "98c2fbbb-1c4c-421c-9d27-fae1884c9b54" (UID: "98c2fbbb-1c4c-421c-9d27-fae1884c9b54"). InnerVolumeSpecName "kube-api-access-q8wrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.574222 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.574258 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.574267 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.574276 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8wrx\" (UniqueName: \"kubernetes.io/projected/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-kube-api-access-q8wrx\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.575666 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98c2fbbb-1c4c-421c-9d27-fae1884c9b54" (UID: "98c2fbbb-1c4c-421c-9d27-fae1884c9b54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.683313 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.684671 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data" (OuterVolumeSpecName: "config-data") pod "98c2fbbb-1c4c-421c-9d27-fae1884c9b54" (UID: "98c2fbbb-1c4c-421c-9d27-fae1884c9b54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4743]: I0310 15:28:21.786510 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c2fbbb-1c4c-421c-9d27-fae1884c9b54-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.330622 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"98c2fbbb-1c4c-421c-9d27-fae1884c9b54","Type":"ContainerDied","Data":"539c675b1f75e292301db8581ad63c1c6896f433a40c382186814d80e250c10c"} Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.331293 4743 scope.go:117] "RemoveContainer" containerID="439d72982dac18daacc3f0d4f42611ef8a18e0f95664b46c08015c1d0ca4f536" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.331586 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.348170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerStarted","Data":"a5346a8a65619168cc227a55e1d4bfdb1f32ab8057f446183674c19eabe53427"} Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.362397 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.371945 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.386035 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 15:28:22 crc kubenswrapper[4743]: E0310 15:28:22.390673 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerName="manila-scheduler" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.390704 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerName="manila-scheduler" Mar 10 15:28:22 crc kubenswrapper[4743]: E0310 15:28:22.390731 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerName="probe" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.390738 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerName="probe" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.390937 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerName="probe" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.390950 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" containerName="manila-scheduler" Mar 10 
15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.407035 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.413329 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.434566 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.438218 4743 scope.go:117] "RemoveContainer" containerID="71096faa0e0eeec1fe27cbe9fdbb571288c7b89b0a890a8aa52a127d11c51f69" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.532821 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.533310 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-config-data\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.533576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-scripts\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.533778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/807051a4-de7a-46e0-a230-f3e843c9ab76-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.533981 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.534105 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6swpv\" (UniqueName: \"kubernetes.io/projected/807051a4-de7a-46e0-a230-f3e843c9ab76-kube-api-access-6swpv\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.636285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-scripts\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.636381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/807051a4-de7a-46e0-a230-f3e843c9ab76-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.636434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-config-data-custom\") pod 
\"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.636459 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6swpv\" (UniqueName: \"kubernetes.io/projected/807051a4-de7a-46e0-a230-f3e843c9ab76-kube-api-access-6swpv\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.636510 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.636553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-config-data\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.636939 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/807051a4-de7a-46e0-a230-f3e843c9ab76-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.647858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-scripts\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.648430 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.648656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-config-data\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.650802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807051a4-de7a-46e0-a230-f3e843c9ab76-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.669409 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6swpv\" (UniqueName: \"kubernetes.io/projected/807051a4-de7a-46e0-a230-f3e843c9ab76-kube-api-access-6swpv\") pod \"manila-scheduler-0\" (UID: \"807051a4-de7a-46e0-a230-f3e843c9ab76\") " pod="openstack/manila-scheduler-0" Mar 10 15:28:22 crc kubenswrapper[4743]: I0310 15:28:22.741543 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 10 15:28:23 crc kubenswrapper[4743]: I0310 15:28:23.300228 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 10 15:28:23 crc kubenswrapper[4743]: I0310 15:28:23.365466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"807051a4-de7a-46e0-a230-f3e843c9ab76","Type":"ContainerStarted","Data":"1b227a6dbc95e00037e2f8826e69186e41b0c8a683010508be60684011e15dbb"} Mar 10 15:28:23 crc kubenswrapper[4743]: I0310 15:28:23.441912 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 10 15:28:23 crc kubenswrapper[4743]: I0310 15:28:23.929552 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c2fbbb-1c4c-421c-9d27-fae1884c9b54" path="/var/lib/kubelet/pods/98c2fbbb-1c4c-421c-9d27-fae1884c9b54/volumes" Mar 10 15:28:24 crc kubenswrapper[4743]: I0310 15:28:24.381696 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"807051a4-de7a-46e0-a230-f3e843c9ab76","Type":"ContainerStarted","Data":"82a70aeb36297a0830bbf3d0608513d4319bc57440d876a95e605c32657f8195"} Mar 10 15:28:24 crc kubenswrapper[4743]: I0310 15:28:24.457933 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:24 crc kubenswrapper[4743]: I0310 15:28:24.461744 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7457b8c8b7-rzp6b" Mar 10 15:28:24 crc kubenswrapper[4743]: W0310 15:28:24.843417 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9e10bd_56f9_4223_a2ea_9eadfe923042.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9e10bd_56f9_4223_a2ea_9eadfe923042.slice: no such file or directory Mar 10 15:28:24 crc kubenswrapper[4743]: W0310 15:28:24.843680 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98c2fbbb_1c4c_421c_9d27_fae1884c9b54.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98c2fbbb_1c4c_421c_9d27_fae1884c9b54.slice: no such file or directory Mar 10 15:28:24 crc kubenswrapper[4743]: W0310 15:28:24.846751 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd528d011_c6fb_4786_8d66_1fc289bd91cc.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd528d011_c6fb_4786_8d66_1fc289bd91cc.slice: no such file or directory Mar 10 15:28:24 crc kubenswrapper[4743]: W0310 15:28:24.847420 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1c5df5c_43af_4b40_8a2d_1db9b79a699b.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1c5df5c_43af_4b40_8a2d_1db9b79a699b.slice: no such file or directory Mar 10 15:28:24 crc kubenswrapper[4743]: W0310 15:28:24.902847 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/user.slice/user-0.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/user.slice/user-0.slice: no such file or directory Mar 10 15:28:24 crc kubenswrapper[4743]: W0310 15:28:24.904380 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda14203e7_ab92_4d5b_86af_d253ddd6b215.slice": 0x40000100 == 
IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda14203e7_ab92_4d5b_86af_d253ddd6b215.slice: no such file or directory Mar 10 15:28:25 crc kubenswrapper[4743]: I0310 15:28:25.397960 4743 generic.go:334] "Generic (PLEG): container finished" podID="38279d60-7565-460d-a703-b6aac3615f2c" containerID="c8cbbb7f60ea28cb5a3d805adcaa560c6f83d49eb8df01efc76ebe87bbbc4688" exitCode=137 Mar 10 15:28:25 crc kubenswrapper[4743]: I0310 15:28:25.398671 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"38279d60-7565-460d-a703-b6aac3615f2c","Type":"ContainerDied","Data":"c8cbbb7f60ea28cb5a3d805adcaa560c6f83d49eb8df01efc76ebe87bbbc4688"} Mar 10 15:28:25 crc kubenswrapper[4743]: I0310 15:28:25.647127 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 10 15:28:25 crc kubenswrapper[4743]: I0310 15:28:25.730291 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.179:8776/healthcheck\": dial tcp 10.217.0.179:8776: connect: connection refused" Mar 10 15:28:27 crc kubenswrapper[4743]: I0310 15:28:27.479071 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:28:27 crc kubenswrapper[4743]: I0310 15:28:27.489316 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64b84f4b48-6qhqj" Mar 10 15:28:27 crc kubenswrapper[4743]: I0310 15:28:27.580755 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68c9f99d4d-r9tj5"] Mar 10 15:28:27 crc kubenswrapper[4743]: I0310 15:28:27.581145 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68c9f99d4d-r9tj5" 
podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerName="placement-log" containerID="cri-o://1ab354af79cb9e6d33738f0ab8137c4fdbed5a4de1c27dffc2ee61ff22fe1b48" gracePeriod=30 Mar 10 15:28:27 crc kubenswrapper[4743]: I0310 15:28:27.581632 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68c9f99d4d-r9tj5" podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerName="placement-api" containerID="cri-o://ab8c8a1d1914ee17944093b828600d4e6fa46ebe86810d67bef6c56595fd288d" gracePeriod=30 Mar 10 15:28:27 crc kubenswrapper[4743]: I0310 15:28:27.987964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 10 15:28:28 crc kubenswrapper[4743]: I0310 15:28:28.057677 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 10 15:28:28 crc kubenswrapper[4743]: I0310 15:28:28.482551 4743 generic.go:334] "Generic (PLEG): container finished" podID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerID="1ab354af79cb9e6d33738f0ab8137c4fdbed5a4de1c27dffc2ee61ff22fe1b48" exitCode=143 Mar 10 15:28:28 crc kubenswrapper[4743]: I0310 15:28:28.483069 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerName="manila-share" containerID="cri-o://cdb32eba8d1a4268f99e0845311ce0a7ee5d539732236cc6b46d5b0a69325b32" gracePeriod=30 Mar 10 15:28:28 crc kubenswrapper[4743]: I0310 15:28:28.482766 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68c9f99d4d-r9tj5" event={"ID":"48b97e49-cad0-4482-ba00-13ba656ac6cb","Type":"ContainerDied","Data":"1ab354af79cb9e6d33738f0ab8137c4fdbed5a4de1c27dffc2ee61ff22fe1b48"} Mar 10 15:28:28 crc kubenswrapper[4743]: I0310 15:28:28.484390 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" 
podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerName="probe" containerID="cri-o://a64e965a332953984abf635ffef38ca101116725561da9efe40af8eda4b37a1a" gracePeriod=30 Mar 10 15:28:29 crc kubenswrapper[4743]: I0310 15:28:29.494673 4743 generic.go:334] "Generic (PLEG): container finished" podID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerID="a64e965a332953984abf635ffef38ca101116725561da9efe40af8eda4b37a1a" exitCode=0 Mar 10 15:28:29 crc kubenswrapper[4743]: I0310 15:28:29.494710 4743 generic.go:334] "Generic (PLEG): container finished" podID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerID="cdb32eba8d1a4268f99e0845311ce0a7ee5d539732236cc6b46d5b0a69325b32" exitCode=1 Mar 10 15:28:29 crc kubenswrapper[4743]: I0310 15:28:29.494729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"4b909bf2-989f-40bd-87ce-ccd96ec9d39e","Type":"ContainerDied","Data":"a64e965a332953984abf635ffef38ca101116725561da9efe40af8eda4b37a1a"} Mar 10 15:28:29 crc kubenswrapper[4743]: I0310 15:28:29.494756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"4b909bf2-989f-40bd-87ce-ccd96ec9d39e","Type":"ContainerDied","Data":"cdb32eba8d1a4268f99e0845311ce0a7ee5d539732236cc6b46d5b0a69325b32"} Mar 10 15:28:30 crc kubenswrapper[4743]: I0310 15:28:30.729937 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.179:8776/healthcheck\": dial tcp 10.217.0.179:8776: connect: connection refused" Mar 10 15:28:31 crc kubenswrapper[4743]: I0310 15:28:31.537523 4743 generic.go:334] "Generic (PLEG): container finished" podID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerID="ab8c8a1d1914ee17944093b828600d4e6fa46ebe86810d67bef6c56595fd288d" exitCode=0 Mar 10 15:28:31 crc kubenswrapper[4743]: I0310 15:28:31.537745 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-68c9f99d4d-r9tj5" event={"ID":"48b97e49-cad0-4482-ba00-13ba656ac6cb","Type":"ContainerDied","Data":"ab8c8a1d1914ee17944093b828600d4e6fa46ebe86810d67bef6c56595fd288d"} Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.104853 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.144739 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjrvd\" (UniqueName: \"kubernetes.io/projected/38279d60-7565-460d-a703-b6aac3615f2c-kube-api-access-tjrvd\") pod \"38279d60-7565-460d-a703-b6aac3615f2c\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.144850 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data\") pod \"38279d60-7565-460d-a703-b6aac3615f2c\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.144991 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-combined-ca-bundle\") pod \"38279d60-7565-460d-a703-b6aac3615f2c\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.145041 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38279d60-7565-460d-a703-b6aac3615f2c-logs\") pod \"38279d60-7565-460d-a703-b6aac3615f2c\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.145087 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/38279d60-7565-460d-a703-b6aac3615f2c-etc-machine-id\") pod \"38279d60-7565-460d-a703-b6aac3615f2c\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.145145 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data-custom\") pod \"38279d60-7565-460d-a703-b6aac3615f2c\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.145226 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-scripts\") pod \"38279d60-7565-460d-a703-b6aac3615f2c\" (UID: \"38279d60-7565-460d-a703-b6aac3615f2c\") " Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.153402 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38279d60-7565-460d-a703-b6aac3615f2c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "38279d60-7565-460d-a703-b6aac3615f2c" (UID: "38279d60-7565-460d-a703-b6aac3615f2c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.154265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38279d60-7565-460d-a703-b6aac3615f2c-logs" (OuterVolumeSpecName: "logs") pod "38279d60-7565-460d-a703-b6aac3615f2c" (UID: "38279d60-7565-460d-a703-b6aac3615f2c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.170124 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38279d60-7565-460d-a703-b6aac3615f2c-kube-api-access-tjrvd" (OuterVolumeSpecName: "kube-api-access-tjrvd") pod "38279d60-7565-460d-a703-b6aac3615f2c" (UID: "38279d60-7565-460d-a703-b6aac3615f2c"). InnerVolumeSpecName "kube-api-access-tjrvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.170213 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-scripts" (OuterVolumeSpecName: "scripts") pod "38279d60-7565-460d-a703-b6aac3615f2c" (UID: "38279d60-7565-460d-a703-b6aac3615f2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.170308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "38279d60-7565-460d-a703-b6aac3615f2c" (UID: "38279d60-7565-460d-a703-b6aac3615f2c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.216966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38279d60-7565-460d-a703-b6aac3615f2c" (UID: "38279d60-7565-460d-a703-b6aac3615f2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.221708 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.248504 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.248545 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjrvd\" (UniqueName: \"kubernetes.io/projected/38279d60-7565-460d-a703-b6aac3615f2c-kube-api-access-tjrvd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.248559 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.248574 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38279d60-7565-460d-a703-b6aac3615f2c-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.248585 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38279d60-7565-460d-a703-b6aac3615f2c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.248596 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.257789 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68c9f99d4d-r9tj5"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.337970 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data" (OuterVolumeSpecName: "config-data") pod "38279d60-7565-460d-a703-b6aac3615f2c" (UID: "38279d60-7565-460d-a703-b6aac3615f2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351087 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-combined-ca-bundle\") pod \"48b97e49-cad0-4482-ba00-13ba656ac6cb\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351144 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-ceph\") pod \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351207 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-config-data\") pod \"48b97e49-cad0-4482-ba00-13ba656ac6cb\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351223 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-scripts\") pod \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351250 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-etc-machine-id\") pod \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351281 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data-custom\") pod \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351302 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd9bz\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-kube-api-access-zd9bz\") pod \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-var-lib-manila\") pod \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351403 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-scripts\") pod \"48b97e49-cad0-4482-ba00-13ba656ac6cb\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-internal-tls-certs\") pod \"48b97e49-cad0-4482-ba00-13ba656ac6cb\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351487 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data\") pod \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351515 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-public-tls-certs\") pod \"48b97e49-cad0-4482-ba00-13ba656ac6cb\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351562 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b97e49-cad0-4482-ba00-13ba656ac6cb-logs\") pod \"48b97e49-cad0-4482-ba00-13ba656ac6cb\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351597 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-combined-ca-bundle\") pod \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\" (UID: \"4b909bf2-989f-40bd-87ce-ccd96ec9d39e\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.351644 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x9cm\" (UniqueName: \"kubernetes.io/projected/48b97e49-cad0-4482-ba00-13ba656ac6cb-kube-api-access-7x9cm\") pod \"48b97e49-cad0-4482-ba00-13ba656ac6cb\" (UID: \"48b97e49-cad0-4482-ba00-13ba656ac6cb\") "
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.352182 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38279d60-7565-460d-a703-b6aac3615f2c-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.354502 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-scripts" (OuterVolumeSpecName: "scripts") pod "4b909bf2-989f-40bd-87ce-ccd96ec9d39e" (UID: "4b909bf2-989f-40bd-87ce-ccd96ec9d39e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.360160 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4b909bf2-989f-40bd-87ce-ccd96ec9d39e" (UID: "4b909bf2-989f-40bd-87ce-ccd96ec9d39e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.360526 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "4b909bf2-989f-40bd-87ce-ccd96ec9d39e" (UID: "4b909bf2-989f-40bd-87ce-ccd96ec9d39e"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.360421 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b97e49-cad0-4482-ba00-13ba656ac6cb-logs" (OuterVolumeSpecName: "logs") pod "48b97e49-cad0-4482-ba00-13ba656ac6cb" (UID: "48b97e49-cad0-4482-ba00-13ba656ac6cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.363625 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-ceph" (OuterVolumeSpecName: "ceph") pod "4b909bf2-989f-40bd-87ce-ccd96ec9d39e" (UID: "4b909bf2-989f-40bd-87ce-ccd96ec9d39e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.368198 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b97e49-cad0-4482-ba00-13ba656ac6cb-kube-api-access-7x9cm" (OuterVolumeSpecName: "kube-api-access-7x9cm") pod "48b97e49-cad0-4482-ba00-13ba656ac6cb" (UID: "48b97e49-cad0-4482-ba00-13ba656ac6cb"). InnerVolumeSpecName "kube-api-access-7x9cm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.370469 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4b909bf2-989f-40bd-87ce-ccd96ec9d39e" (UID: "4b909bf2-989f-40bd-87ce-ccd96ec9d39e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.377491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-scripts" (OuterVolumeSpecName: "scripts") pod "48b97e49-cad0-4482-ba00-13ba656ac6cb" (UID: "48b97e49-cad0-4482-ba00-13ba656ac6cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.389242 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-kube-api-access-zd9bz" (OuterVolumeSpecName: "kube-api-access-zd9bz") pod "4b909bf2-989f-40bd-87ce-ccd96ec9d39e" (UID: "4b909bf2-989f-40bd-87ce-ccd96ec9d39e"). InnerVolumeSpecName "kube-api-access-zd9bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455597 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48b97e49-cad0-4482-ba00-13ba656ac6cb-logs\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455626 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x9cm\" (UniqueName: \"kubernetes.io/projected/48b97e49-cad0-4482-ba00-13ba656ac6cb-kube-api-access-7x9cm\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455638 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-ceph\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455648 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455656 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455667 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455675 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd9bz\" (UniqueName: \"kubernetes.io/projected/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-kube-api-access-zd9bz\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455683 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-var-lib-manila\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.455690 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.482681 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-config-data" (OuterVolumeSpecName: "config-data") pod "48b97e49-cad0-4482-ba00-13ba656ac6cb" (UID: "48b97e49-cad0-4482-ba00-13ba656ac6cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.488535 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48b97e49-cad0-4482-ba00-13ba656ac6cb" (UID: "48b97e49-cad0-4482-ba00-13ba656ac6cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.525909 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b909bf2-989f-40bd-87ce-ccd96ec9d39e" (UID: "4b909bf2-989f-40bd-87ce-ccd96ec9d39e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.562175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"807051a4-de7a-46e0-a230-f3e843c9ab76","Type":"ContainerStarted","Data":"4e5c0f81eddd6427d2271217eb588132e95335a5716ce1a73f0bc3f99b7fbbd1"}
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.573624 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.573671 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.573688 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.581861 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68c9f99d4d-r9tj5" event={"ID":"48b97e49-cad0-4482-ba00-13ba656ac6cb","Type":"ContainerDied","Data":"120de600b08187c22afbc1b21e4e6de575b76f4f4941ef47cdbf10068a81d7cd"}
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.581961 4743 scope.go:117] "RemoveContainer" containerID="ab8c8a1d1914ee17944093b828600d4e6fa46ebe86810d67bef6c56595fd288d"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.582238 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68c9f99d4d-r9tj5"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.592089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerStarted","Data":"3037732e4bc0559e7b6cde4cfd03ea550f68128efb421bf7c4edb48f3c0b9e62"}
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.592723 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="ceilometer-central-agent" containerID="cri-o://6010b19f4be05d160cf049a7d05bf3cbadafd0e2a58035676a5bee21950c3aec" gracePeriod=30
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.592889 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.592975 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="proxy-httpd" containerID="cri-o://3037732e4bc0559e7b6cde4cfd03ea550f68128efb421bf7c4edb48f3c0b9e62" gracePeriod=30
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.593049 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="sg-core" containerID="cri-o://a5346a8a65619168cc227a55e1d4bfdb1f32ab8057f446183674c19eabe53427" gracePeriod=30
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.593118 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="ceilometer-notification-agent" containerID="cri-o://7855c5beceb4ee5a4847552ceae24450b14a4577211aa818bd718a1b1bc4f6a1" gracePeriod=30
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.613174 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data" (OuterVolumeSpecName: "config-data") pod "4b909bf2-989f-40bd-87ce-ccd96ec9d39e" (UID: "4b909bf2-989f-40bd-87ce-ccd96ec9d39e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.626665 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=10.626642107 podStartE2EDuration="10.626642107s" podCreationTimestamp="2026-03-10 15:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:32.584674116 +0000 UTC m=+1377.291488864" watchObservedRunningTime="2026-03-10 15:28:32.626642107 +0000 UTC m=+1377.333456855"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.633677 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=10.324380709 podStartE2EDuration="14.633659758s" podCreationTimestamp="2026-03-10 15:28:18 +0000 UTC" firstStartedPulling="2026-03-10 15:28:19.31190178 +0000 UTC m=+1364.018716528" lastFinishedPulling="2026-03-10 15:28:23.621180829 +0000 UTC m=+1368.327995577" observedRunningTime="2026-03-10 15:28:32.631506106 +0000 UTC m=+1377.338320854" watchObservedRunningTime="2026-03-10 15:28:32.633659758 +0000 UTC m=+1377.340474506"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.651733 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"4b909bf2-989f-40bd-87ce-ccd96ec9d39e","Type":"ContainerDied","Data":"1d8f5503ae1c00b7df8210ca51791146eb07edb8a081650513e943b537567148"}
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.660018 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.670433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"38279d60-7565-460d-a703-b6aac3615f2c","Type":"ContainerDied","Data":"555e7d56ce1ea5496f618063bd838d008834a067c6d19263ca93a1a6aad9611a"}
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.670581 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.672050 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "48b97e49-cad0-4482-ba00-13ba656ac6cb" (UID: "48b97e49-cad0-4482-ba00-13ba656ac6cb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.674986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "48b97e49-cad0-4482-ba00-13ba656ac6cb" (UID: "48b97e49-cad0-4482-ba00-13ba656ac6cb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.676958 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.676986 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b909bf2-989f-40bd-87ce-ccd96ec9d39e-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.676997 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b97e49-cad0-4482-ba00-13ba656ac6cb-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.684829 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da9eebbd-7a82-441b-8ca6-14657357a1f0","Type":"ContainerStarted","Data":"e1016b1255d677813bad133f5bb3a5fd7cb82defe7d52354735fa1963d69dbc6"}
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.732326 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=6.575289103 podStartE2EDuration="22.73228726s" podCreationTimestamp="2026-03-10 15:28:10 +0000 UTC" firstStartedPulling="2026-03-10 15:28:15.451939379 +0000 UTC m=+1360.158754127" lastFinishedPulling="2026-03-10 15:28:31.608937536 +0000 UTC m=+1376.315752284" observedRunningTime="2026-03-10 15:28:32.704455364 +0000 UTC m=+1377.411270112" watchObservedRunningTime="2026-03-10 15:28:32.73228726 +0000 UTC m=+1377.439102008"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.743236 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.786996 4743 scope.go:117] "RemoveContainer" containerID="1ab354af79cb9e6d33738f0ab8137c4fdbed5a4de1c27dffc2ee61ff22fe1b48"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.808805 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.816725 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.817806 4743 scope.go:117] "RemoveContainer" containerID="a64e965a332953984abf635ffef38ca101116725561da9efe40af8eda4b37a1a"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.842480 4743 scope.go:117] "RemoveContainer" containerID="cdb32eba8d1a4268f99e0845311ce0a7ee5d539732236cc6b46d5b0a69325b32"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856018 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Mar 10 15:28:32 crc kubenswrapper[4743]: E0310 15:28:32.856532 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856553 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api"
Mar 10 15:28:32 crc kubenswrapper[4743]: E0310 15:28:32.856564 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerName="probe"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856571 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerName="probe"
Mar 10 15:28:32 crc kubenswrapper[4743]: E0310 15:28:32.856620 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerName="placement-api"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856630 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerName="placement-api"
Mar 10 15:28:32 crc kubenswrapper[4743]: E0310 15:28:32.856648 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerName="manila-share"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856654 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerName="manila-share"
Mar 10 15:28:32 crc kubenswrapper[4743]: E0310 15:28:32.856667 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerName="placement-log"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856674 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerName="placement-log"
Mar 10 15:28:32 crc kubenswrapper[4743]: E0310 15:28:32.856694 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api-log"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856699 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api-log"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856922 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerName="placement-log"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856935 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" containerName="placement-api"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856946 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerName="probe"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856954 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856964 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="38279d60-7565-460d-a703-b6aac3615f2c" containerName="cinder-api-log"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.856972 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" containerName="manila-share"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.858298 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.862204 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.877860 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.891156 4743 scope.go:117] "RemoveContainer" containerID="c8cbbb7f60ea28cb5a3d805adcaa560c6f83d49eb8df01efc76ebe87bbbc4688"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.901218 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.922876 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.951380 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.953195 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.963388 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.963876 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.964101 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.967089 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.986717 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx7fd\" (UniqueName: \"kubernetes.io/projected/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-kube-api-access-rx7fd\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.986784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.986845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.986865 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.986997 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5211b51-a212-41a2-9c1d-62e0029300e2-logs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987045 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-scripts\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987206 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-ceph\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987232 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987267 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ff46\" (UniqueName: \"kubernetes.io/projected/c5211b51-a212-41a2-9c1d-62e0029300e2-kube-api-access-7ff46\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987378 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5211b51-a212-41a2-9c1d-62e0029300e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987505 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-config-data\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987555 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-config-data\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987597 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:32 crc kubenswrapper[4743]: I0310 15:28:32.987661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-scripts\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.031226 4743 scope.go:117] "RemoveContainer" containerID="2efc484f75e97c22037a95da5ff71f40ded1470d05ea07d0728da7b449951e4b"
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.061291 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68c9f99d4d-r9tj5"]
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.072682 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-68c9f99d4d-r9tj5"]
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ff46\" (UniqueName: \"kubernetes.io/projected/c5211b51-a212-41a2-9c1d-62e0029300e2-kube-api-access-7ff46\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5211b51-a212-41a2-9c1d-62e0029300e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-config-data\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-config-data\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0"
Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0"
Mar 10
15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-scripts\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089682 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx7fd\" (UniqueName: \"kubernetes.io/projected/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-kube-api-access-rx7fd\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089722 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089785 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c5211b51-a212-41a2-9c1d-62e0029300e2-logs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089801 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-scripts\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-ceph\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.089933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " 
pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.090044 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.090440 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5211b51-a212-41a2-9c1d-62e0029300e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.093940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.094613 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5211b51-a212-41a2-9c1d-62e0029300e2-logs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.095143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.099546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.100443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-scripts\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.100848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.104858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-config-data\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.106700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.107056 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 
15:28:33.108266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-scripts\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.108830 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.108914 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-ceph\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.114449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx7fd\" (UniqueName: \"kubernetes.io/projected/f94c3f98-f911-491e-bbae-5e5c8b3d0c10-kube-api-access-rx7fd\") pod \"manila-share-share1-0\" (UID: \"f94c3f98-f911-491e-bbae-5e5c8b3d0c10\") " pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.114664 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ff46\" (UniqueName: \"kubernetes.io/projected/c5211b51-a212-41a2-9c1d-62e0029300e2-kube-api-access-7ff46\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.116674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c5211b51-a212-41a2-9c1d-62e0029300e2-config-data\") pod \"cinder-api-0\" (UID: \"c5211b51-a212-41a2-9c1d-62e0029300e2\") " pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.308014 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.333024 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.715649 4743 generic.go:334] "Generic (PLEG): container finished" podID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerID="3037732e4bc0559e7b6cde4cfd03ea550f68128efb421bf7c4edb48f3c0b9e62" exitCode=0 Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.716468 4743 generic.go:334] "Generic (PLEG): container finished" podID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerID="a5346a8a65619168cc227a55e1d4bfdb1f32ab8057f446183674c19eabe53427" exitCode=2 Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.716482 4743 generic.go:334] "Generic (PLEG): container finished" podID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerID="7855c5beceb4ee5a4847552ceae24450b14a4577211aa818bd718a1b1bc4f6a1" exitCode=0 Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.716492 4743 generic.go:334] "Generic (PLEG): container finished" podID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerID="6010b19f4be05d160cf049a7d05bf3cbadafd0e2a58035676a5bee21950c3aec" exitCode=0 Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.715862 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerDied","Data":"3037732e4bc0559e7b6cde4cfd03ea550f68128efb421bf7c4edb48f3c0b9e62"} Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.716592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerDied","Data":"a5346a8a65619168cc227a55e1d4bfdb1f32ab8057f446183674c19eabe53427"} Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.716609 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerDied","Data":"7855c5beceb4ee5a4847552ceae24450b14a4577211aa818bd718a1b1bc4f6a1"} Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.716621 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerDied","Data":"6010b19f4be05d160cf049a7d05bf3cbadafd0e2a58035676a5bee21950c3aec"} Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.716631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da7188b-76b5-4b5c-9b73-061bc70b044d","Type":"ContainerDied","Data":"99d967202aade8dbc5b45c3419595d8f5621f4fa44ab8bb34e844be344e7c0a2"} Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.716642 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99d967202aade8dbc5b45c3419595d8f5621f4fa44ab8bb34e844be344e7c0a2" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.799110 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.913872 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-scripts\") pod \"0da7188b-76b5-4b5c-9b73-061bc70b044d\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.915039 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-combined-ca-bundle\") pod \"0da7188b-76b5-4b5c-9b73-061bc70b044d\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.915066 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-log-httpd\") pod \"0da7188b-76b5-4b5c-9b73-061bc70b044d\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.915094 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgpsh\" (UniqueName: \"kubernetes.io/projected/0da7188b-76b5-4b5c-9b73-061bc70b044d-kube-api-access-zgpsh\") pod \"0da7188b-76b5-4b5c-9b73-061bc70b044d\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.915236 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-sg-core-conf-yaml\") pod \"0da7188b-76b5-4b5c-9b73-061bc70b044d\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.915256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-run-httpd\") pod \"0da7188b-76b5-4b5c-9b73-061bc70b044d\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.915282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-config-data\") pod \"0da7188b-76b5-4b5c-9b73-061bc70b044d\" (UID: \"0da7188b-76b5-4b5c-9b73-061bc70b044d\") " Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.921599 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0da7188b-76b5-4b5c-9b73-061bc70b044d" (UID: "0da7188b-76b5-4b5c-9b73-061bc70b044d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.921807 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0da7188b-76b5-4b5c-9b73-061bc70b044d" (UID: "0da7188b-76b5-4b5c-9b73-061bc70b044d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.923866 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da7188b-76b5-4b5c-9b73-061bc70b044d-kube-api-access-zgpsh" (OuterVolumeSpecName: "kube-api-access-zgpsh") pod "0da7188b-76b5-4b5c-9b73-061bc70b044d" (UID: "0da7188b-76b5-4b5c-9b73-061bc70b044d"). InnerVolumeSpecName "kube-api-access-zgpsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.931597 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38279d60-7565-460d-a703-b6aac3615f2c" path="/var/lib/kubelet/pods/38279d60-7565-460d-a703-b6aac3615f2c/volumes" Mar 10 15:28:33 crc kubenswrapper[4743]: W0310 15:28:33.932766 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5211b51_a212_41a2_9c1d_62e0029300e2.slice/crio-e609f10ec1b64e14b5f6d70912589188e59ffb9d6186812173effc9a5786f77d WatchSource:0}: Error finding container e609f10ec1b64e14b5f6d70912589188e59ffb9d6186812173effc9a5786f77d: Status 404 returned error can't find the container with id e609f10ec1b64e14b5f6d70912589188e59ffb9d6186812173effc9a5786f77d Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.933568 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b97e49-cad0-4482-ba00-13ba656ac6cb" path="/var/lib/kubelet/pods/48b97e49-cad0-4482-ba00-13ba656ac6cb/volumes" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.934785 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b909bf2-989f-40bd-87ce-ccd96ec9d39e" path="/var/lib/kubelet/pods/4b909bf2-989f-40bd-87ce-ccd96ec9d39e/volumes" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.934977 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-scripts" (OuterVolumeSpecName: "scripts") pod "0da7188b-76b5-4b5c-9b73-061bc70b044d" (UID: "0da7188b-76b5-4b5c-9b73-061bc70b044d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.940405 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:28:33 crc kubenswrapper[4743]: I0310 15:28:33.962010 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0da7188b-76b5-4b5c-9b73-061bc70b044d" (UID: "0da7188b-76b5-4b5c-9b73-061bc70b044d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.010666 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.017648 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.017680 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgpsh\" (UniqueName: \"kubernetes.io/projected/0da7188b-76b5-4b5c-9b73-061bc70b044d-kube-api-access-zgpsh\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.017690 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.017699 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da7188b-76b5-4b5c-9b73-061bc70b044d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.017707 4743 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:34 crc kubenswrapper[4743]: W0310 15:28:34.018864 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf94c3f98_f911_491e_bbae_5e5c8b3d0c10.slice/crio-ee738949862d5840329a0e0f8e35fef8078a1191a33c75cd7408a7c28ed0e290 WatchSource:0}: Error finding container ee738949862d5840329a0e0f8e35fef8078a1191a33c75cd7408a7c28ed0e290: Status 404 returned error can't find the container with id ee738949862d5840329a0e0f8e35fef8078a1191a33c75cd7408a7c28ed0e290 Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.030034 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0da7188b-76b5-4b5c-9b73-061bc70b044d" (UID: "0da7188b-76b5-4b5c-9b73-061bc70b044d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.067610 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-config-data" (OuterVolumeSpecName: "config-data") pod "0da7188b-76b5-4b5c-9b73-061bc70b044d" (UID: "0da7188b-76b5-4b5c-9b73-061bc70b044d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.119765 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.119829 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da7188b-76b5-4b5c-9b73-061bc70b044d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.397549 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.398237 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerName="glance-httpd" containerID="cri-o://04feccecae990e1118c080ef411816b601adfa63b5b3339b6fd74c230ad7d27e" gracePeriod=30 Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.397791 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerName="glance-log" containerID="cri-o://1e5e910037c86b0214ac4e00c30ffb1b59927eb92cb3557b80803ce966bea0bc" gracePeriod=30 Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.715211 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.744760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c5211b51-a212-41a2-9c1d-62e0029300e2","Type":"ContainerStarted","Data":"e609f10ec1b64e14b5f6d70912589188e59ffb9d6186812173effc9a5786f77d"} Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 
15:28:34.753360 4743 generic.go:334] "Generic (PLEG): container finished" podID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerID="1e5e910037c86b0214ac4e00c30ffb1b59927eb92cb3557b80803ce966bea0bc" exitCode=143 Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.753474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07af3cb-3345-42d9-86c1-38bfbc27a259","Type":"ContainerDied","Data":"1e5e910037c86b0214ac4e00c30ffb1b59927eb92cb3557b80803ce966bea0bc"} Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.777041 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.781389 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f94c3f98-f911-491e-bbae-5e5c8b3d0c10","Type":"ContainerStarted","Data":"e1a7bd28ee510171f339ac4ddeda6d1cbf69dd034f68f8df4484e4fa9db393fb"} Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.781480 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f94c3f98-f911-491e-bbae-5e5c8b3d0c10","Type":"ContainerStarted","Data":"ee738949862d5840329a0e0f8e35fef8078a1191a33c75cd7408a7c28ed0e290"} Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.787918 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.846520 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.868011 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.885290 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:34 crc kubenswrapper[4743]: E0310 15:28:34.885784 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="ceilometer-central-agent" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.885806 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="ceilometer-central-agent" Mar 10 15:28:34 crc kubenswrapper[4743]: E0310 15:28:34.885847 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="proxy-httpd" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.885856 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="proxy-httpd" Mar 10 15:28:34 crc kubenswrapper[4743]: E0310 15:28:34.885867 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="sg-core" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.885873 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="sg-core" Mar 10 15:28:34 crc kubenswrapper[4743]: E0310 15:28:34.885926 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="ceilometer-notification-agent" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.885937 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="ceilometer-notification-agent" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.886283 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="ceilometer-central-agent" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.886317 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="sg-core" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 
15:28:34.886353 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="proxy-httpd" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.886371 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" containerName="ceilometer-notification-agent" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.888730 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.892506 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.893474 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.898513 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.961185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-log-httpd\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.961595 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-config-data\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.961633 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.961675 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqcxf\" (UniqueName: \"kubernetes.io/projected/dffc7cae-41b7-46f6-951e-cd016d2a61b8-kube-api-access-sqcxf\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.961701 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-scripts\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.961767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-run-httpd\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:34 crc kubenswrapper[4743]: I0310 15:28:34.961791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.066285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.066370 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqcxf\" (UniqueName: \"kubernetes.io/projected/dffc7cae-41b7-46f6-951e-cd016d2a61b8-kube-api-access-sqcxf\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.066400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-scripts\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.066453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-run-httpd\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.066478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.066632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-log-httpd\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.066648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-config-data\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.073086 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-run-httpd\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.073590 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-log-httpd\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.074427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.093138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqcxf\" (UniqueName: \"kubernetes.io/projected/dffc7cae-41b7-46f6-951e-cd016d2a61b8-kube-api-access-sqcxf\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.120763 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-scripts\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.122775 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.123048 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-config-data\") pod \"ceilometer-0\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.220142 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.785710 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:35 crc kubenswrapper[4743]: W0310 15:28:35.787545 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddffc7cae_41b7_46f6_951e_cd016d2a61b8.slice/crio-c04c3aaf511db0c1327368ec80d03d4b8e4f1d3891683da90e7e724988e5f501 WatchSource:0}: Error finding container c04c3aaf511db0c1327368ec80d03d4b8e4f1d3891683da90e7e724988e5f501: Status 404 returned error can't find the container with id c04c3aaf511db0c1327368ec80d03d4b8e4f1d3891683da90e7e724988e5f501 Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.789716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f94c3f98-f911-491e-bbae-5e5c8b3d0c10","Type":"ContainerStarted","Data":"cdccc7f0b4095d820b566ea65ac6549d0a7fa34f3e7e25fa7c0e5c1b587503a7"} Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.796878 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"c5211b51-a212-41a2-9c1d-62e0029300e2","Type":"ContainerStarted","Data":"62c0063c915fc584118bc87d90292801d0731a7ff400082aa1846addfa6b20c3"} Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.796923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c5211b51-a212-41a2-9c1d-62e0029300e2","Type":"ContainerStarted","Data":"57521a239601893edeef01f0901884bcf754f092d931fd72bb00081c26450df0"} Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.797049 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.826078 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.8260586070000002 podStartE2EDuration="3.826058607s" podCreationTimestamp="2026-03-10 15:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:35.818763329 +0000 UTC m=+1380.525578077" watchObservedRunningTime="2026-03-10 15:28:35.826058607 +0000 UTC m=+1380.532873355" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.850541 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.850321301 podStartE2EDuration="3.850321301s" podCreationTimestamp="2026-03-10 15:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:35.844293139 +0000 UTC m=+1380.551107887" watchObservedRunningTime="2026-03-10 15:28:35.850321301 +0000 UTC m=+1380.557136049" Mar 10 15:28:35 crc kubenswrapper[4743]: I0310 15:28:35.927958 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da7188b-76b5-4b5c-9b73-061bc70b044d" path="/var/lib/kubelet/pods/0da7188b-76b5-4b5c-9b73-061bc70b044d/volumes" Mar 10 
15:28:36 crc kubenswrapper[4743]: I0310 15:28:36.817321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerStarted","Data":"9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34"} Mar 10 15:28:36 crc kubenswrapper[4743]: I0310 15:28:36.817597 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerStarted","Data":"c04c3aaf511db0c1327368ec80d03d4b8e4f1d3891683da90e7e724988e5f501"} Mar 10 15:28:36 crc kubenswrapper[4743]: I0310 15:28:36.864209 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-958fd895b-mxn2t" Mar 10 15:28:36 crc kubenswrapper[4743]: I0310 15:28:36.963544 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7954db6464-ns5cf"] Mar 10 15:28:36 crc kubenswrapper[4743]: I0310 15:28:36.965185 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon-log" containerID="cri-o://1a4a17b7c6b5e58ecc54899710b1a069b044db12fac0070f22bb4b2126726a01" gracePeriod=30 Mar 10 15:28:36 crc kubenswrapper[4743]: I0310 15:28:36.965792 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" containerID="cri-o://f0661b001f1e4aa05e793a6f1f5306c28f23ea2a63110c85c0439180d4cfb904" gracePeriod=30 Mar 10 15:28:36 crc kubenswrapper[4743]: I0310 15:28:36.976233 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 10 15:28:37 crc 
kubenswrapper[4743]: I0310 15:28:37.028684 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:37 crc kubenswrapper[4743]: I0310 15:28:37.843336 4743 generic.go:334] "Generic (PLEG): container finished" podID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerID="04feccecae990e1118c080ef411816b601adfa63b5b3339b6fd74c230ad7d27e" exitCode=0 Mar 10 15:28:37 crc kubenswrapper[4743]: I0310 15:28:37.843394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07af3cb-3345-42d9-86c1-38bfbc27a259","Type":"ContainerDied","Data":"04feccecae990e1118c080ef411816b601adfa63b5b3339b6fd74c230ad7d27e"} Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.108526 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.151480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-public-tls-certs\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.151784 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-combined-ca-bundle\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.151887 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-logs\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.152062 
4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.152228 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-ceph\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.152355 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77s72\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-kube-api-access-77s72\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.152465 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-httpd-run\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.152547 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-scripts\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.152678 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-config-data\") pod \"a07af3cb-3345-42d9-86c1-38bfbc27a259\" (UID: \"a07af3cb-3345-42d9-86c1-38bfbc27a259\") " Mar 10 15:28:38 
crc kubenswrapper[4743]: I0310 15:28:38.152371 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-logs" (OuterVolumeSpecName: "logs") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.153025 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.180593 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-kube-api-access-77s72" (OuterVolumeSpecName: "kube-api-access-77s72") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "kube-api-access-77s72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.184790 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.187082 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-ceph" (OuterVolumeSpecName: "ceph") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.216835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-scripts" (OuterVolumeSpecName: "scripts") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.243940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.267221 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.267263 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.267292 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.267302 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.267312 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77s72\" (UniqueName: \"kubernetes.io/projected/a07af3cb-3345-42d9-86c1-38bfbc27a259-kube-api-access-77s72\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.267324 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a07af3cb-3345-42d9-86c1-38bfbc27a259-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.267332 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.284042 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.292546 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-config-data" (OuterVolumeSpecName: "config-data") pod "a07af3cb-3345-42d9-86c1-38bfbc27a259" (UID: "a07af3cb-3345-42d9-86c1-38bfbc27a259"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.312479 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.369331 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.369589 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07af3cb-3345-42d9-86c1-38bfbc27a259-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.369663 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.856787 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.856801 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a07af3cb-3345-42d9-86c1-38bfbc27a259","Type":"ContainerDied","Data":"a3fb04a542e234f93c972f918f7d5a7eab27a87b8e1025651754c0e57e8b7cb0"} Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.858743 4743 scope.go:117] "RemoveContainer" containerID="04feccecae990e1118c080ef411816b601adfa63b5b3339b6fd74c230ad7d27e" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.859598 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerStarted","Data":"31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1"} Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.910651 4743 scope.go:117] "RemoveContainer" containerID="1e5e910037c86b0214ac4e00c30ffb1b59927eb92cb3557b80803ce966bea0bc" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.945997 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.963932 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.978515 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:28:38 crc kubenswrapper[4743]: E0310 15:28:38.985331 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerName="glance-httpd" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.985415 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerName="glance-httpd" Mar 10 15:28:38 crc kubenswrapper[4743]: E0310 15:28:38.985437 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerName="glance-log" Mar 10 15:28:38 crc kubenswrapper[4743]: I0310 15:28:38.985445 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerName="glance-log" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.018777 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerName="glance-httpd" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.018870 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" containerName="glance-log" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.022362 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.025188 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.026692 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.026825 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112231 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bvjh\" (UniqueName: \"kubernetes.io/projected/e2f464dd-fd03-486c-afac-a4e86a4e5226-kube-api-access-7bvjh\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2f464dd-fd03-486c-afac-a4e86a4e5226-ceph\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112466 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112512 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2f464dd-fd03-486c-afac-a4e86a4e5226-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.112670 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f464dd-fd03-486c-afac-a4e86a4e5226-logs\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.214398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.214458 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.214491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/e2f464dd-fd03-486c-afac-a4e86a4e5226-ceph\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.214526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.214561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.214593 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.215025 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.216110 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2f464dd-fd03-486c-afac-a4e86a4e5226-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.216164 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f464dd-fd03-486c-afac-a4e86a4e5226-logs\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.216243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bvjh\" (UniqueName: \"kubernetes.io/projected/e2f464dd-fd03-486c-afac-a4e86a4e5226-kube-api-access-7bvjh\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.216777 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2f464dd-fd03-486c-afac-a4e86a4e5226-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.216994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f464dd-fd03-486c-afac-a4e86a4e5226-logs\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.222231 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " 
pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.223233 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.224253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.237196 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2f464dd-fd03-486c-afac-a4e86a4e5226-ceph\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.239032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f464dd-fd03-486c-afac-a4e86a4e5226-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.241490 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bvjh\" (UniqueName: \"kubernetes.io/projected/e2f464dd-fd03-486c-afac-a4e86a4e5226-kube-api-access-7bvjh\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 
15:28:39.272039 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e2f464dd-fd03-486c-afac-a4e86a4e5226\") " pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.361355 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.872428 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerStarted","Data":"ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289"} Mar 10 15:28:39 crc kubenswrapper[4743]: I0310 15:28:39.927271 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a07af3cb-3345-42d9-86c1-38bfbc27a259" path="/var/lib/kubelet/pods/a07af3cb-3345-42d9-86c1-38bfbc27a259/volumes" Mar 10 15:28:40 crc kubenswrapper[4743]: W0310 15:28:40.044505 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2f464dd_fd03_486c_afac_a4e86a4e5226.slice/crio-a1f2b7a4c387bfc249342c93006193b03e7aad3f349d4a1a8493586d53c32347 WatchSource:0}: Error finding container a1f2b7a4c387bfc249342c93006193b03e7aad3f349d4a1a8493586d53c32347: Status 404 returned error can't find the container with id a1f2b7a4c387bfc249342c93006193b03e7aad3f349d4a1a8493586d53c32347 Mar 10 15:28:40 crc kubenswrapper[4743]: I0310 15:28:40.048958 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:28:40 crc kubenswrapper[4743]: I0310 15:28:40.369285 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:50848->10.217.0.156:8443: read: connection reset by peer" Mar 10 15:28:40 crc kubenswrapper[4743]: I0310 15:28:40.748332 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 10 15:28:40 crc kubenswrapper[4743]: I0310 15:28:40.888539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2f464dd-fd03-486c-afac-a4e86a4e5226","Type":"ContainerStarted","Data":"3ea8d046e33613c66857d95174323e2a2b529acfe668d73d716c40c16b0e2ffc"} Mar 10 15:28:40 crc kubenswrapper[4743]: I0310 15:28:40.888590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2f464dd-fd03-486c-afac-a4e86a4e5226","Type":"ContainerStarted","Data":"a1f2b7a4c387bfc249342c93006193b03e7aad3f349d4a1a8493586d53c32347"} Mar 10 15:28:40 crc kubenswrapper[4743]: I0310 15:28:40.891268 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0001988-feba-4afe-9068-071af12a6fd7" containerID="f0661b001f1e4aa05e793a6f1f5306c28f23ea2a63110c85c0439180d4cfb904" exitCode=0 Mar 10 15:28:40 crc kubenswrapper[4743]: I0310 15:28:40.891319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7954db6464-ns5cf" event={"ID":"c0001988-feba-4afe-9068-071af12a6fd7","Type":"ContainerDied","Data":"f0661b001f1e4aa05e793a6f1f5306c28f23ea2a63110c85c0439180d4cfb904"} Mar 10 15:28:40 crc kubenswrapper[4743]: I0310 15:28:40.891359 4743 scope.go:117] "RemoveContainer" containerID="8219c4b41eb7545ea397447365b146eeb777554480ab78e1282ead6e8b54a642" Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 
15:28:41.252734 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.253098 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.913020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2f464dd-fd03-486c-afac-a4e86a4e5226","Type":"ContainerStarted","Data":"c9c78ccec20ead983883b66dac8a3a4edddadf73cc6b09667f4890c5d558bfc4"} Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.922519 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="sg-core" containerID="cri-o://ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289" gracePeriod=30 Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.922548 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="proxy-httpd" containerID="cri-o://7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de" gracePeriod=30 Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.922645 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="ceilometer-central-agent" 
containerID="cri-o://9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34" gracePeriod=30 Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.922591 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="ceilometer-notification-agent" containerID="cri-o://31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1" gracePeriod=30 Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.928725 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.928758 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerStarted","Data":"7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de"} Mar 10 15:28:41 crc kubenswrapper[4743]: I0310 15:28:41.947882 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.947865262 podStartE2EDuration="3.947865262s" podCreationTimestamp="2026-03-10 15:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:41.939476152 +0000 UTC m=+1386.646290900" watchObservedRunningTime="2026-03-10 15:28:41.947865262 +0000 UTC m=+1386.654680010" Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.735875 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.75489074 podStartE2EDuration="8.73585125s" podCreationTimestamp="2026-03-10 15:28:34 +0000 UTC" firstStartedPulling="2026-03-10 15:28:35.793593518 +0000 UTC m=+1380.500408276" lastFinishedPulling="2026-03-10 15:28:40.774554038 +0000 UTC m=+1385.481368786" observedRunningTime="2026-03-10 15:28:41.967318408 +0000 UTC 
m=+1386.674133166" watchObservedRunningTime="2026-03-10 15:28:42.73585125 +0000 UTC m=+1387.442665998" Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.743793 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.744352 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerName="glance-log" containerID="cri-o://105ce5d166ea3a22c561a690e615f2485bb91dc82e438a69315e11efdcb92a8e" gracePeriod=30 Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.744695 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerName="glance-httpd" containerID="cri-o://98b1e11f0be782cc4b452f74cc1e3cca3a597669f37c1f0010806d1bfb98284d" gracePeriod=30 Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.936467 4743 generic.go:334] "Generic (PLEG): container finished" podID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerID="ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289" exitCode=2 Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.936796 4743 generic.go:334] "Generic (PLEG): container finished" podID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerID="31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1" exitCode=0 Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.936542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerDied","Data":"ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289"} Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.936960 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerDied","Data":"31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1"} Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.940645 4743 generic.go:334] "Generic (PLEG): container finished" podID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerID="105ce5d166ea3a22c561a690e615f2485bb91dc82e438a69315e11efdcb92a8e" exitCode=143 Mar 10 15:28:42 crc kubenswrapper[4743]: I0310 15:28:42.940718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eedca3dd-44be-41bf-b7e5-7ac48e4e7264","Type":"ContainerDied","Data":"105ce5d166ea3a22c561a690e615f2485bb91dc82e438a69315e11efdcb92a8e"} Mar 10 15:28:43 crc kubenswrapper[4743]: I0310 15:28:43.308524 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 10 15:28:43 crc kubenswrapper[4743]: I0310 15:28:43.958205 4743 generic.go:334] "Generic (PLEG): container finished" podID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerID="7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de" exitCode=0 Mar 10 15:28:43 crc kubenswrapper[4743]: I0310 15:28:43.958445 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerDied","Data":"7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de"} Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.620281 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.830199 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.953347 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-log-httpd\") pod \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.954509 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-scripts\") pod \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.954622 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-config-data\") pod \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.954237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dffc7cae-41b7-46f6-951e-cd016d2a61b8" (UID: "dffc7cae-41b7-46f6-951e-cd016d2a61b8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.955462 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-sg-core-conf-yaml\") pod \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.955606 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-run-httpd\") pod \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.955744 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqcxf\" (UniqueName: \"kubernetes.io/projected/dffc7cae-41b7-46f6-951e-cd016d2a61b8-kube-api-access-sqcxf\") pod \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.955864 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-combined-ca-bundle\") pod \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\" (UID: \"dffc7cae-41b7-46f6-951e-cd016d2a61b8\") " Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.956352 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.956582 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"dffc7cae-41b7-46f6-951e-cd016d2a61b8" (UID: "dffc7cae-41b7-46f6-951e-cd016d2a61b8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.961809 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dffc7cae-41b7-46f6-951e-cd016d2a61b8-kube-api-access-sqcxf" (OuterVolumeSpecName: "kube-api-access-sqcxf") pod "dffc7cae-41b7-46f6-951e-cd016d2a61b8" (UID: "dffc7cae-41b7-46f6-951e-cd016d2a61b8"). InnerVolumeSpecName "kube-api-access-sqcxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.965519 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-scripts" (OuterVolumeSpecName: "scripts") pod "dffc7cae-41b7-46f6-951e-cd016d2a61b8" (UID: "dffc7cae-41b7-46f6-951e-cd016d2a61b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.970395 4743 generic.go:334] "Generic (PLEG): container finished" podID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerID="9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34" exitCode=0 Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.970460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerDied","Data":"9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34"} Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.970522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dffc7cae-41b7-46f6-951e-cd016d2a61b8","Type":"ContainerDied","Data":"c04c3aaf511db0c1327368ec80d03d4b8e4f1d3891683da90e7e724988e5f501"} Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.970542 4743 scope.go:117] 
"RemoveContainer" containerID="7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.970708 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:44 crc kubenswrapper[4743]: I0310 15:28:44.988638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dffc7cae-41b7-46f6-951e-cd016d2a61b8" (UID: "dffc7cae-41b7-46f6-951e-cd016d2a61b8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.059007 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.059075 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.059085 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dffc7cae-41b7-46f6-951e-cd016d2a61b8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.059095 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqcxf\" (UniqueName: \"kubernetes.io/projected/dffc7cae-41b7-46f6-951e-cd016d2a61b8-kube-api-access-sqcxf\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.078952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "dffc7cae-41b7-46f6-951e-cd016d2a61b8" (UID: "dffc7cae-41b7-46f6-951e-cd016d2a61b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.081280 4743 scope.go:117] "RemoveContainer" containerID="ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.103521 4743 scope.go:117] "RemoveContainer" containerID="31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.111037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-config-data" (OuterVolumeSpecName: "config-data") pod "dffc7cae-41b7-46f6-951e-cd016d2a61b8" (UID: "dffc7cae-41b7-46f6-951e-cd016d2a61b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.130724 4743 scope.go:117] "RemoveContainer" containerID="9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.156772 4743 scope.go:117] "RemoveContainer" containerID="7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de" Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.159402 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de\": container with ID starting with 7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de not found: ID does not exist" containerID="7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.159454 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de"} err="failed to get container status \"7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de\": rpc error: code = NotFound desc = could not find container \"7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de\": container with ID starting with 7f632f34d05c602cbdf2e56ff4ddb188bdfc946257f24eb003cc292ab6fbd1de not found: ID does not exist" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.159482 4743 scope.go:117] "RemoveContainer" containerID="ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.160676 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.160709 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dffc7cae-41b7-46f6-951e-cd016d2a61b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.160954 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289\": container with ID starting with ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289 not found: ID does not exist" containerID="ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.160996 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289"} err="failed to get container status \"ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289\": rpc error: code = 
NotFound desc = could not find container \"ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289\": container with ID starting with ddcdf279d88e2df277c6e932910ef87a73e4b102e37382456c81557d64256289 not found: ID does not exist" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.161020 4743 scope.go:117] "RemoveContainer" containerID="31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1" Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.161426 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1\": container with ID starting with 31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1 not found: ID does not exist" containerID="31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.161465 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1"} err="failed to get container status \"31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1\": rpc error: code = NotFound desc = could not find container \"31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1\": container with ID starting with 31debb9e766f0e3e0a9feed5a53007542231cb3689e2f55fc6a52da07abc31e1 not found: ID does not exist" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.161492 4743 scope.go:117] "RemoveContainer" containerID="9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34" Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.161752 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34\": container with ID starting with 
9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34 not found: ID does not exist" containerID="9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.161783 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34"} err="failed to get container status \"9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34\": rpc error: code = NotFound desc = could not find container \"9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34\": container with ID starting with 9f60e80d60772aec346e8e8593dba82effb012a784eb90e4324f271d29f40b34 not found: ID does not exist" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.316553 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.332069 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.341708 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.342952 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="ceilometer-notification-agent" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.342976 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="ceilometer-notification-agent" Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.343004 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="sg-core" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.343011 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" 
containerName="sg-core" Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.343036 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="proxy-httpd" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.343042 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="proxy-httpd" Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.343054 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="ceilometer-central-agent" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.343060 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="ceilometer-central-agent" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.343246 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="sg-core" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.343264 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="ceilometer-central-agent" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.343276 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="ceilometer-notification-agent" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.343286 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" containerName="proxy-httpd" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.346220 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.358155 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.363257 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.363454 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.469196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-config-data\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.469262 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-log-httpd\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.469299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-scripts\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.469338 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-run-httpd\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 
15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.469762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.469873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.473658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r2jq\" (UniqueName: \"kubernetes.io/projected/0699eee8-2d25-4c52-827e-b287c47934f8-kube-api-access-5r2jq\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: E0310 15:28:45.517155 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddffc7cae_41b7_46f6_951e_cd016d2a61b8.slice\": RecentStats: unable to find data in memory cache]" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.543200 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.579224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-log-httpd\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 
10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.579291 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-scripts\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.579326 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-run-httpd\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.579414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.579449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.579483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r2jq\" (UniqueName: \"kubernetes.io/projected/0699eee8-2d25-4c52-827e-b287c47934f8-kube-api-access-5r2jq\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.579647 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-config-data\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.579948 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-log-httpd\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.580012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-run-httpd\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.589765 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-config-data\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.593496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.599125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.599608 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-scripts\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.602328 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r2jq\" (UniqueName: \"kubernetes.io/projected/0699eee8-2d25-4c52-827e-b287c47934f8-kube-api-access-5r2jq\") pod \"ceilometer-0\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.688453 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.928894 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dffc7cae-41b7-46f6-951e-cd016d2a61b8" path="/var/lib/kubelet/pods/dffc7cae-41b7-46f6-951e-cd016d2a61b8/volumes" Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.993498 4743 generic.go:334] "Generic (PLEG): container finished" podID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerID="98b1e11f0be782cc4b452f74cc1e3cca3a597669f37c1f0010806d1bfb98284d" exitCode=0 Mar 10 15:28:45 crc kubenswrapper[4743]: I0310 15:28:45.993564 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eedca3dd-44be-41bf-b7e5-7ac48e4e7264","Type":"ContainerDied","Data":"98b1e11f0be782cc4b452f74cc1e3cca3a597669f37c1f0010806d1bfb98284d"} Mar 10 15:28:46 crc kubenswrapper[4743]: W0310 15:28:46.184640 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0699eee8_2d25_4c52_827e_b287c47934f8.slice/crio-afa1d28e558d298911eb144b000018d62caf8d2a7f8e4867eba367f3865a1095 WatchSource:0}: Error finding container 
afa1d28e558d298911eb144b000018d62caf8d2a7f8e4867eba367f3865a1095: Status 404 returned error can't find the container with id afa1d28e558d298911eb144b000018d62caf8d2a7f8e4867eba367f3865a1095 Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.185182 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.319177 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400113 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-config-data\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-scripts\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-httpd-run\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400275 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400379 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-logs\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400484 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-internal-tls-certs\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400546 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-combined-ca-bundle\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400608 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-ceph\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.400649 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gvb4\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-kube-api-access-4gvb4\") pod \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\" (UID: \"eedca3dd-44be-41bf-b7e5-7ac48e4e7264\") " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.401529 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.405497 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-scripts" (OuterVolumeSpecName: "scripts") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.405774 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-logs" (OuterVolumeSpecName: "logs") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.407506 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-kube-api-access-4gvb4" (OuterVolumeSpecName: "kube-api-access-4gvb4") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "kube-api-access-4gvb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.411057 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.411452 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-ceph" (OuterVolumeSpecName: "ceph") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.437369 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.449212 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-config-data" (OuterVolumeSpecName: "config-data") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.478906 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eedca3dd-44be-41bf-b7e5-7ac48e4e7264" (UID: "eedca3dd-44be-41bf-b7e5-7ac48e4e7264"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.502910 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.502957 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-ceph\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.502972 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gvb4\" (UniqueName: \"kubernetes.io/projected/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-kube-api-access-4gvb4\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.502987 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.503001 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.503012 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.503058 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.503071 4743 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.503081 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedca3dd-44be-41bf-b7e5-7ac48e4e7264-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.525553 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 15:28:46 crc kubenswrapper[4743]: I0310 15:28:46.604985 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.007368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eedca3dd-44be-41bf-b7e5-7ac48e4e7264","Type":"ContainerDied","Data":"14f383569976f9c3b33ee7512ba1bba581411bd6590de6876304b5a23454e84a"} Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.008133 4743 scope.go:117] "RemoveContainer" containerID="98b1e11f0be782cc4b452f74cc1e3cca3a597669f37c1f0010806d1bfb98284d" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.007386 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.011274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerStarted","Data":"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8"} Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.011327 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerStarted","Data":"afa1d28e558d298911eb144b000018d62caf8d2a7f8e4867eba367f3865a1095"} Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.040151 4743 scope.go:117] "RemoveContainer" containerID="105ce5d166ea3a22c561a690e615f2485bb91dc82e438a69315e11efdcb92a8e" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.050980 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.071902 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.093277 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:28:47 crc kubenswrapper[4743]: E0310 15:28:47.093687 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerName="glance-httpd" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.093703 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerName="glance-httpd" Mar 10 15:28:47 crc kubenswrapper[4743]: E0310 15:28:47.093731 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerName="glance-log" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.093737 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerName="glance-log" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.093962 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerName="glance-httpd" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.093988 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" containerName="glance-log" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.095051 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.099252 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.099463 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.126781 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218320 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms665\" (UniqueName: \"kubernetes.io/projected/ae118361-9b75-4d40-8145-fcefb244db30-kube-api-access-ms665\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218427 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218452 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ae118361-9b75-4d40-8145-fcefb244db30-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae118361-9b75-4d40-8145-fcefb244db30-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae118361-9b75-4d40-8145-fcefb244db30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.218934 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.320790 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.320880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.320937 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms665\" (UniqueName: \"kubernetes.io/projected/ae118361-9b75-4d40-8145-fcefb244db30-kube-api-access-ms665\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.320983 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.321022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.321046 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.321366 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.321742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ae118361-9b75-4d40-8145-fcefb244db30-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc 
kubenswrapper[4743]: I0310 15:28:47.323074 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae118361-9b75-4d40-8145-fcefb244db30-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.323351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae118361-9b75-4d40-8145-fcefb244db30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.323684 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae118361-9b75-4d40-8145-fcefb244db30-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.323922 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae118361-9b75-4d40-8145-fcefb244db30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.327198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.327520 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.327542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.328892 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ae118361-9b75-4d40-8145-fcefb244db30-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.329134 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae118361-9b75-4d40-8145-fcefb244db30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.338938 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms665\" (UniqueName: \"kubernetes.io/projected/ae118361-9b75-4d40-8145-fcefb244db30-kube-api-access-ms665\") pod \"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.349377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"ae118361-9b75-4d40-8145-fcefb244db30\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.440452 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:47 crc kubenswrapper[4743]: I0310 15:28:47.928694 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eedca3dd-44be-41bf-b7e5-7ac48e4e7264" path="/var/lib/kubelet/pods/eedca3dd-44be-41bf-b7e5-7ac48e4e7264/volumes" Mar 10 15:28:48 crc kubenswrapper[4743]: I0310 15:28:48.027159 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerStarted","Data":"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f"} Mar 10 15:28:48 crc kubenswrapper[4743]: I0310 15:28:48.082711 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.044990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae118361-9b75-4d40-8145-fcefb244db30","Type":"ContainerStarted","Data":"67029d0d06dbcf2336a0be63e3861057c35fb2bb3c5bf32e0f031f5c58ab88c8"} Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.045397 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae118361-9b75-4d40-8145-fcefb244db30","Type":"ContainerStarted","Data":"401fabd896a6e6f96540d9ca090e1f9d7f41ba9effe6b9c58df43663d141b99f"} Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.048833 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerStarted","Data":"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f"} Mar 10 15:28:49 crc 
kubenswrapper[4743]: I0310 15:28:49.362477 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.362846 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.400283 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-lv5qs"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.401469 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.421745 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lv5qs"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.487263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbzz\" (UniqueName: \"kubernetes.io/projected/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-kube-api-access-khbzz\") pod \"nova-api-db-create-lv5qs\" (UID: \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\") " pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.487338 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-operator-scripts\") pod \"nova-api-db-create-lv5qs\" (UID: \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\") " pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.531128 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.534957 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-db-create-wkklm"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.536593 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.575486 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wkklm"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.591400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbzz\" (UniqueName: \"kubernetes.io/projected/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-kube-api-access-khbzz\") pod \"nova-api-db-create-lv5qs\" (UID: \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\") " pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.591480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-operator-scripts\") pod \"nova-cell0-db-create-wkklm\" (UID: \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\") " pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.591535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-operator-scripts\") pod \"nova-api-db-create-lv5qs\" (UID: \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\") " pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.591604 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6d4q\" (UniqueName: \"kubernetes.io/projected/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-kube-api-access-f6d4q\") pod \"nova-cell0-db-create-wkklm\" (UID: \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\") " pod="openstack/nova-cell0-db-create-wkklm" Mar 
10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.602270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-operator-scripts\") pod \"nova-api-db-create-lv5qs\" (UID: \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\") " pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.617436 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.641786 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbzz\" (UniqueName: \"kubernetes.io/projected/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-kube-api-access-khbzz\") pod \"nova-api-db-create-lv5qs\" (UID: \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\") " pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.696072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6d4q\" (UniqueName: \"kubernetes.io/projected/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-kube-api-access-f6d4q\") pod \"nova-cell0-db-create-wkklm\" (UID: \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\") " pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.696314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-operator-scripts\") pod \"nova-cell0-db-create-wkklm\" (UID: \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\") " pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.696951 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nxpnp"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.697514 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-operator-scripts\") pod \"nova-cell0-db-create-wkklm\" (UID: \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\") " pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.703461 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.739285 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.743070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6d4q\" (UniqueName: \"kubernetes.io/projected/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-kube-api-access-f6d4q\") pod \"nova-cell0-db-create-wkklm\" (UID: \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\") " pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.777227 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nxpnp"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.795346 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-be97-account-create-update-9xhqq"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.796697 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.798691 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0532548b-3818-4f8f-bf72-c7ba529b4a8c-operator-scripts\") pod \"nova-cell1-db-create-nxpnp\" (UID: \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\") " pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.798747 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sqnx\" (UniqueName: \"kubernetes.io/projected/0532548b-3818-4f8f-bf72-c7ba529b4a8c-kube-api-access-8sqnx\") pod \"nova-cell1-db-create-nxpnp\" (UID: \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\") " pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.800395 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.838925 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-be97-account-create-update-9xhqq"] Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.928549 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0532548b-3818-4f8f-bf72-c7ba529b4a8c-operator-scripts\") pod \"nova-cell1-db-create-nxpnp\" (UID: \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\") " pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.929480 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.931296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnr2q\" (UniqueName: \"kubernetes.io/projected/ab66fedc-1151-429b-9301-14e20af0cc44-kube-api-access-rnr2q\") pod \"nova-api-be97-account-create-update-9xhqq\" (UID: \"ab66fedc-1151-429b-9301-14e20af0cc44\") " pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.931449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sqnx\" (UniqueName: \"kubernetes.io/projected/0532548b-3818-4f8f-bf72-c7ba529b4a8c-kube-api-access-8sqnx\") pod \"nova-cell1-db-create-nxpnp\" (UID: \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\") " pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.931571 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab66fedc-1151-429b-9301-14e20af0cc44-operator-scripts\") pod \"nova-api-be97-account-create-update-9xhqq\" (UID: \"ab66fedc-1151-429b-9301-14e20af0cc44\") " pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.932075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0532548b-3818-4f8f-bf72-c7ba529b4a8c-operator-scripts\") pod \"nova-cell1-db-create-nxpnp\" (UID: \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\") " pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:49 crc kubenswrapper[4743]: I0310 15:28:49.975059 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sqnx\" (UniqueName: \"kubernetes.io/projected/0532548b-3818-4f8f-bf72-c7ba529b4a8c-kube-api-access-8sqnx\") 
pod \"nova-cell1-db-create-nxpnp\" (UID: \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\") " pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.012266 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-392d-account-create-update-cvlg4"] Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.034467 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-392d-account-create-update-cvlg4"] Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.034598 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.037578 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.039054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab66fedc-1151-429b-9301-14e20af0cc44-operator-scripts\") pod \"nova-api-be97-account-create-update-9xhqq\" (UID: \"ab66fedc-1151-429b-9301-14e20af0cc44\") " pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.039241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnr2q\" (UniqueName: \"kubernetes.io/projected/ab66fedc-1151-429b-9301-14e20af0cc44-kube-api-access-rnr2q\") pod \"nova-api-be97-account-create-update-9xhqq\" (UID: \"ab66fedc-1151-429b-9301-14e20af0cc44\") " pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.040793 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab66fedc-1151-429b-9301-14e20af0cc44-operator-scripts\") pod \"nova-api-be97-account-create-update-9xhqq\" 
(UID: \"ab66fedc-1151-429b-9301-14e20af0cc44\") " pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.064400 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnr2q\" (UniqueName: \"kubernetes.io/projected/ab66fedc-1151-429b-9301-14e20af0cc44-kube-api-access-rnr2q\") pod \"nova-api-be97-account-create-update-9xhqq\" (UID: \"ab66fedc-1151-429b-9301-14e20af0cc44\") " pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.064966 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.116420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae118361-9b75-4d40-8145-fcefb244db30","Type":"ContainerStarted","Data":"957ac6eedfc5eef29e7da9eb905d3ce23ac9b1b84a1434e9dca6c20b2e10d416"} Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.116462 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.116473 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.116592 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.145967 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr4tj\" (UniqueName: \"kubernetes.io/projected/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-kube-api-access-hr4tj\") pod \"nova-cell0-392d-account-create-update-cvlg4\" (UID: \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\") " pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.146065 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-operator-scripts\") pod \"nova-cell0-392d-account-create-update-cvlg4\" (UID: \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\") " pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.147887 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-95c8-account-create-update-8vlkw"] Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.149401 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.157189 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.175688 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-95c8-account-create-update-8vlkw"] Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.176533 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.176518542 podStartE2EDuration="3.176518542s" podCreationTimestamp="2026-03-10 15:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:50.142835628 +0000 UTC m=+1394.849650386" watchObservedRunningTime="2026-03-10 15:28:50.176518542 +0000 UTC m=+1394.883333290" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.249124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr4tj\" (UniqueName: \"kubernetes.io/projected/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-kube-api-access-hr4tj\") pod \"nova-cell0-392d-account-create-update-cvlg4\" (UID: \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\") " pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.249256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-operator-scripts\") pod \"nova-cell0-392d-account-create-update-cvlg4\" (UID: \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\") " pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.249363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b692e051-bc25-4039-ac2d-10d17a93a44a-operator-scripts\") pod \"nova-cell1-95c8-account-create-update-8vlkw\" (UID: \"b692e051-bc25-4039-ac2d-10d17a93a44a\") " pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.249608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rf4\" (UniqueName: \"kubernetes.io/projected/b692e051-bc25-4039-ac2d-10d17a93a44a-kube-api-access-44rf4\") pod \"nova-cell1-95c8-account-create-update-8vlkw\" (UID: \"b692e051-bc25-4039-ac2d-10d17a93a44a\") " pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.256638 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-operator-scripts\") pod \"nova-cell0-392d-account-create-update-cvlg4\" (UID: \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\") " pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.282221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr4tj\" (UniqueName: \"kubernetes.io/projected/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-kube-api-access-hr4tj\") pod \"nova-cell0-392d-account-create-update-cvlg4\" (UID: \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\") " pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.353506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rf4\" (UniqueName: \"kubernetes.io/projected/b692e051-bc25-4039-ac2d-10d17a93a44a-kube-api-access-44rf4\") pod \"nova-cell1-95c8-account-create-update-8vlkw\" (UID: \"b692e051-bc25-4039-ac2d-10d17a93a44a\") " 
pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.354003 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b692e051-bc25-4039-ac2d-10d17a93a44a-operator-scripts\") pod \"nova-cell1-95c8-account-create-update-8vlkw\" (UID: \"b692e051-bc25-4039-ac2d-10d17a93a44a\") " pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.354734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b692e051-bc25-4039-ac2d-10d17a93a44a-operator-scripts\") pod \"nova-cell1-95c8-account-create-update-8vlkw\" (UID: \"b692e051-bc25-4039-ac2d-10d17a93a44a\") " pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.359196 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lv5qs"] Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.377487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rf4\" (UniqueName: \"kubernetes.io/projected/b692e051-bc25-4039-ac2d-10d17a93a44a-kube-api-access-44rf4\") pod \"nova-cell1-95c8-account-create-update-8vlkw\" (UID: \"b692e051-bc25-4039-ac2d-10d17a93a44a\") " pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.417488 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.515424 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.617311 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wkklm"] Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.623907 4743 scope.go:117] "RemoveContainer" containerID="70764383e6ce924a21c83bd9b4931b0226da63de849bc2f595fe0103e5771bbd" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.758224 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.790133 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nxpnp"] Mar 10 15:28:50 crc kubenswrapper[4743]: W0310 15:28:50.827331 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0532548b_3818_4f8f_bf72_c7ba529b4a8c.slice/crio-b6df83ab5ab1b25040840061b0daf97c8e6de7767216886f6b28e06d95f223bd WatchSource:0}: Error finding container b6df83ab5ab1b25040840061b0daf97c8e6de7767216886f6b28e06d95f223bd: Status 404 returned error can't find the container with id b6df83ab5ab1b25040840061b0daf97c8e6de7767216886f6b28e06d95f223bd Mar 10 15:28:50 crc kubenswrapper[4743]: I0310 15:28:50.927678 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-be97-account-create-update-9xhqq"] Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.047837 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-392d-account-create-update-cvlg4"] Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.141362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerStarted","Data":"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706"} Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.146061 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.156390 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lv5qs" event={"ID":"ecaa5958-eab2-4af0-b0ba-43d0c91517f1","Type":"ContainerStarted","Data":"5eb0c295f33cd6777c1ef34bb445783dff19e6f5f544635d824bf84d494ce516"} Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.156443 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lv5qs" event={"ID":"ecaa5958-eab2-4af0-b0ba-43d0c91517f1","Type":"ContainerStarted","Data":"7aaae1a693d34ba4ae623a8cad201f818c70665c152bee2a5149c721c38ba142"} Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.177514 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.322165017 podStartE2EDuration="6.177494285s" podCreationTimestamp="2026-03-10 15:28:45 +0000 UTC" firstStartedPulling="2026-03-10 15:28:46.189115354 +0000 UTC m=+1390.895930102" lastFinishedPulling="2026-03-10 15:28:50.044444622 +0000 UTC m=+1394.751259370" observedRunningTime="2026-03-10 15:28:51.170374421 +0000 UTC m=+1395.877189169" watchObservedRunningTime="2026-03-10 15:28:51.177494285 +0000 UTC m=+1395.884309033" Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.182190 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-392d-account-create-update-cvlg4" event={"ID":"8c02071d-0a01-4d7d-bbd5-727544dd7fbe","Type":"ContainerStarted","Data":"56d757ce6348281cea0942eea0ece47b5106bb5e19cbf29a541e54b126b1562f"} Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.186430 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-be97-account-create-update-9xhqq" event={"ID":"ab66fedc-1151-429b-9301-14e20af0cc44","Type":"ContainerStarted","Data":"8e5e34edaf5551ef66eeb6e8ab5615c63c94a807c13bbbec370e36d4294a09e7"} Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.191653 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nxpnp" event={"ID":"0532548b-3818-4f8f-bf72-c7ba529b4a8c","Type":"ContainerStarted","Data":"b6df83ab5ab1b25040840061b0daf97c8e6de7767216886f6b28e06d95f223bd"} Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.194139 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wkklm" event={"ID":"354ad36f-b0fb-4520-b5ed-b9ca3a65579c","Type":"ContainerStarted","Data":"cb39a822f634e37b6719035bd47681f7837f265b07e4883a7a87bf6605302e08"} Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.203717 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-lv5qs" podStartSLOduration=2.203696624 podStartE2EDuration="2.203696624s" podCreationTimestamp="2026-03-10 15:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:51.188024446 +0000 UTC m=+1395.894839214" watchObservedRunningTime="2026-03-10 15:28:51.203696624 +0000 UTC m=+1395.910511372" Mar 10 15:28:51 crc kubenswrapper[4743]: I0310 15:28:51.296068 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-95c8-account-create-update-8vlkw"] Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.210216 4743 generic.go:334] "Generic (PLEG): container finished" podID="b692e051-bc25-4039-ac2d-10d17a93a44a" containerID="84baa1f756512af822219736b6626e3fc403936fa6127e5fc358eaa54b095f9b" exitCode=0 Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.210864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" event={"ID":"b692e051-bc25-4039-ac2d-10d17a93a44a","Type":"ContainerDied","Data":"84baa1f756512af822219736b6626e3fc403936fa6127e5fc358eaa54b095f9b"} Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.210902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" event={"ID":"b692e051-bc25-4039-ac2d-10d17a93a44a","Type":"ContainerStarted","Data":"ffa4095d4f8ac7b7e295ec1f4da0fb9870192b7238732f2002edca775746bc0b"} Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.214151 4743 generic.go:334] "Generic (PLEG): container finished" podID="354ad36f-b0fb-4520-b5ed-b9ca3a65579c" containerID="4f82bd6b7be4ae3036d2af4517825cf4f4e1689e3a61947d39477f32bcdea00c" exitCode=0 Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.214262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wkklm" event={"ID":"354ad36f-b0fb-4520-b5ed-b9ca3a65579c","Type":"ContainerDied","Data":"4f82bd6b7be4ae3036d2af4517825cf4f4e1689e3a61947d39477f32bcdea00c"} Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.216275 4743 generic.go:334] "Generic (PLEG): container finished" podID="ecaa5958-eab2-4af0-b0ba-43d0c91517f1" containerID="5eb0c295f33cd6777c1ef34bb445783dff19e6f5f544635d824bf84d494ce516" exitCode=0 Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.216340 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lv5qs" event={"ID":"ecaa5958-eab2-4af0-b0ba-43d0c91517f1","Type":"ContainerDied","Data":"5eb0c295f33cd6777c1ef34bb445783dff19e6f5f544635d824bf84d494ce516"} Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.219847 4743 generic.go:334] "Generic (PLEG): container finished" podID="0532548b-3818-4f8f-bf72-c7ba529b4a8c" containerID="f03eb86dbeab0d1d4f8221d962b019a810019848adf138a66b41c7b77c8f0d9a" exitCode=0 Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.219980 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nxpnp" event={"ID":"0532548b-3818-4f8f-bf72-c7ba529b4a8c","Type":"ContainerDied","Data":"f03eb86dbeab0d1d4f8221d962b019a810019848adf138a66b41c7b77c8f0d9a"} Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.236292 4743 generic.go:334] "Generic (PLEG): container finished" podID="ab66fedc-1151-429b-9301-14e20af0cc44" containerID="ff49b55f2a74a9839399581c6e40028716b275ce41497fdcd5d42c8d093f713f" exitCode=0 Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.236484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-be97-account-create-update-9xhqq" event={"ID":"ab66fedc-1151-429b-9301-14e20af0cc44","Type":"ContainerDied","Data":"ff49b55f2a74a9839399581c6e40028716b275ce41497fdcd5d42c8d093f713f"} Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.242672 4743 generic.go:334] "Generic (PLEG): container finished" podID="8c02071d-0a01-4d7d-bbd5-727544dd7fbe" containerID="35f28da742cacba26ec83c9cc64c97f16d333200d4989f59cb8e895c24055d85" exitCode=0 Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.245868 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-392d-account-create-update-cvlg4" event={"ID":"8c02071d-0a01-4d7d-bbd5-727544dd7fbe","Type":"ContainerDied","Data":"35f28da742cacba26ec83c9cc64c97f16d333200d4989f59cb8e895c24055d85"} Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.245965 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:28:52 crc kubenswrapper[4743]: I0310 15:28:52.245976 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:28:53 crc kubenswrapper[4743]: I0310 15:28:53.643594 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 15:28:53 crc kubenswrapper[4743]: I0310 15:28:53.644390 4743 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Mar 10 15:28:53 crc kubenswrapper[4743]: I0310 15:28:53.677141 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 15:28:53 crc kubenswrapper[4743]: I0310 15:28:53.795537 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:53 crc kubenswrapper[4743]: I0310 15:28:53.884934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0532548b-3818-4f8f-bf72-c7ba529b4a8c-operator-scripts\") pod \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\" (UID: \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\") " Mar 10 15:28:53 crc kubenswrapper[4743]: I0310 15:28:53.885124 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sqnx\" (UniqueName: \"kubernetes.io/projected/0532548b-3818-4f8f-bf72-c7ba529b4a8c-kube-api-access-8sqnx\") pod \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\" (UID: \"0532548b-3818-4f8f-bf72-c7ba529b4a8c\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:53.887432 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0532548b-3818-4f8f-bf72-c7ba529b4a8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0532548b-3818-4f8f-bf72-c7ba529b4a8c" (UID: "0532548b-3818-4f8f-bf72-c7ba529b4a8c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:53.911099 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0532548b-3818-4f8f-bf72-c7ba529b4a8c-kube-api-access-8sqnx" (OuterVolumeSpecName: "kube-api-access-8sqnx") pod "0532548b-3818-4f8f-bf72-c7ba529b4a8c" (UID: "0532548b-3818-4f8f-bf72-c7ba529b4a8c"). InnerVolumeSpecName "kube-api-access-8sqnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:53.989948 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0532548b-3818-4f8f-bf72-c7ba529b4a8c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:53.989977 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sqnx\" (UniqueName: \"kubernetes.io/projected/0532548b-3818-4f8f-bf72-c7ba529b4a8c-kube-api-access-8sqnx\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.182102 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.194840 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.195260 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.197359 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab66fedc-1151-429b-9301-14e20af0cc44-operator-scripts\") pod \"ab66fedc-1151-429b-9301-14e20af0cc44\" (UID: \"ab66fedc-1151-429b-9301-14e20af0cc44\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.197427 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnr2q\" (UniqueName: \"kubernetes.io/projected/ab66fedc-1151-429b-9301-14e20af0cc44-kube-api-access-rnr2q\") pod \"ab66fedc-1151-429b-9301-14e20af0cc44\" (UID: \"ab66fedc-1151-429b-9301-14e20af0cc44\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.203403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab66fedc-1151-429b-9301-14e20af0cc44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab66fedc-1151-429b-9301-14e20af0cc44" (UID: "ab66fedc-1151-429b-9301-14e20af0cc44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.204401 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab66fedc-1151-429b-9301-14e20af0cc44-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.213558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab66fedc-1151-429b-9301-14e20af0cc44-kube-api-access-rnr2q" (OuterVolumeSpecName: "kube-api-access-rnr2q") pod "ab66fedc-1151-429b-9301-14e20af0cc44" (UID: "ab66fedc-1151-429b-9301-14e20af0cc44"). InnerVolumeSpecName "kube-api-access-rnr2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.239764 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.260004 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.287879 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-be97-account-create-update-9xhqq" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.287880 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-be97-account-create-update-9xhqq" event={"ID":"ab66fedc-1151-429b-9301-14e20af0cc44","Type":"ContainerDied","Data":"8e5e34edaf5551ef66eeb6e8ab5615c63c94a807c13bbbec370e36d4294a09e7"} Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.287952 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e5e34edaf5551ef66eeb6e8ab5615c63c94a807c13bbbec370e36d4294a09e7" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.303468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lv5qs" event={"ID":"ecaa5958-eab2-4af0-b0ba-43d0c91517f1","Type":"ContainerDied","Data":"7aaae1a693d34ba4ae623a8cad201f818c70665c152bee2a5149c721c38ba142"} Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.303493 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-lv5qs" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.303516 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aaae1a693d34ba4ae623a8cad201f818c70665c152bee2a5149c721c38ba142" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.306163 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b692e051-bc25-4039-ac2d-10d17a93a44a-operator-scripts\") pod \"b692e051-bc25-4039-ac2d-10d17a93a44a\" (UID: \"b692e051-bc25-4039-ac2d-10d17a93a44a\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.306404 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-operator-scripts\") pod \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\" (UID: \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.306456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-operator-scripts\") pod \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\" (UID: \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.306481 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6d4q\" (UniqueName: \"kubernetes.io/projected/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-kube-api-access-f6d4q\") pod \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\" (UID: \"354ad36f-b0fb-4520-b5ed-b9ca3a65579c\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.306527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-operator-scripts\") pod 
\"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\" (UID: \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.306565 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khbzz\" (UniqueName: \"kubernetes.io/projected/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-kube-api-access-khbzz\") pod \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\" (UID: \"ecaa5958-eab2-4af0-b0ba-43d0c91517f1\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.306642 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44rf4\" (UniqueName: \"kubernetes.io/projected/b692e051-bc25-4039-ac2d-10d17a93a44a-kube-api-access-44rf4\") pod \"b692e051-bc25-4039-ac2d-10d17a93a44a\" (UID: \"b692e051-bc25-4039-ac2d-10d17a93a44a\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.306700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr4tj\" (UniqueName: \"kubernetes.io/projected/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-kube-api-access-hr4tj\") pod \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\" (UID: \"8c02071d-0a01-4d7d-bbd5-727544dd7fbe\") " Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.309786 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecaa5958-eab2-4af0-b0ba-43d0c91517f1" (UID: "ecaa5958-eab2-4af0-b0ba-43d0c91517f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.310568 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b692e051-bc25-4039-ac2d-10d17a93a44a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b692e051-bc25-4039-ac2d-10d17a93a44a" (UID: "b692e051-bc25-4039-ac2d-10d17a93a44a"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.311153 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "354ad36f-b0fb-4520-b5ed-b9ca3a65579c" (UID: "354ad36f-b0fb-4520-b5ed-b9ca3a65579c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.311317 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-kube-api-access-f6d4q" (OuterVolumeSpecName: "kube-api-access-f6d4q") pod "354ad36f-b0fb-4520-b5ed-b9ca3a65579c" (UID: "354ad36f-b0fb-4520-b5ed-b9ca3a65579c"). InnerVolumeSpecName "kube-api-access-f6d4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.311430 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c02071d-0a01-4d7d-bbd5-727544dd7fbe" (UID: "8c02071d-0a01-4d7d-bbd5-727544dd7fbe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.312685 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-kube-api-access-khbzz" (OuterVolumeSpecName: "kube-api-access-khbzz") pod "ecaa5958-eab2-4af0-b0ba-43d0c91517f1" (UID: "ecaa5958-eab2-4af0-b0ba-43d0c91517f1"). InnerVolumeSpecName "kube-api-access-khbzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313096 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313114 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313126 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6d4q\" (UniqueName: \"kubernetes.io/projected/354ad36f-b0fb-4520-b5ed-b9ca3a65579c-kube-api-access-f6d4q\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313135 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313145 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khbzz\" (UniqueName: \"kubernetes.io/projected/ecaa5958-eab2-4af0-b0ba-43d0c91517f1-kube-api-access-khbzz\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313154 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnr2q\" (UniqueName: \"kubernetes.io/projected/ab66fedc-1151-429b-9301-14e20af0cc44-kube-api-access-rnr2q\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313165 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b692e051-bc25-4039-ac2d-10d17a93a44a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 
15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nxpnp" event={"ID":"0532548b-3818-4f8f-bf72-c7ba529b4a8c","Type":"ContainerDied","Data":"b6df83ab5ab1b25040840061b0daf97c8e6de7767216886f6b28e06d95f223bd"} Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313583 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6df83ab5ab1b25040840061b0daf97c8e6de7767216886f6b28e06d95f223bd" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.313676 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nxpnp" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.318993 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b692e051-bc25-4039-ac2d-10d17a93a44a-kube-api-access-44rf4" (OuterVolumeSpecName: "kube-api-access-44rf4") pod "b692e051-bc25-4039-ac2d-10d17a93a44a" (UID: "b692e051-bc25-4039-ac2d-10d17a93a44a"). InnerVolumeSpecName "kube-api-access-44rf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.319281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-392d-account-create-update-cvlg4" event={"ID":"8c02071d-0a01-4d7d-bbd5-727544dd7fbe","Type":"ContainerDied","Data":"56d757ce6348281cea0942eea0ece47b5106bb5e19cbf29a541e54b126b1562f"} Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.319316 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d757ce6348281cea0942eea0ece47b5106bb5e19cbf29a541e54b126b1562f" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.319383 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-392d-account-create-update-cvlg4" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.327286 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-kube-api-access-hr4tj" (OuterVolumeSpecName: "kube-api-access-hr4tj") pod "8c02071d-0a01-4d7d-bbd5-727544dd7fbe" (UID: "8c02071d-0a01-4d7d-bbd5-727544dd7fbe"). InnerVolumeSpecName "kube-api-access-hr4tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.339473 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.339533 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-95c8-account-create-update-8vlkw" event={"ID":"b692e051-bc25-4039-ac2d-10d17a93a44a","Type":"ContainerDied","Data":"ffa4095d4f8ac7b7e295ec1f4da0fb9870192b7238732f2002edca775746bc0b"} Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.339571 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffa4095d4f8ac7b7e295ec1f4da0fb9870192b7238732f2002edca775746bc0b" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.346170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wkklm" event={"ID":"354ad36f-b0fb-4520-b5ed-b9ca3a65579c","Type":"ContainerDied","Data":"cb39a822f634e37b6719035bd47681f7837f265b07e4883a7a87bf6605302e08"} Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.346226 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb39a822f634e37b6719035bd47681f7837f265b07e4883a7a87bf6605302e08" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.346297 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wkklm" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.415253 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44rf4\" (UniqueName: \"kubernetes.io/projected/b692e051-bc25-4039-ac2d-10d17a93a44a-kube-api-access-44rf4\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:54.415287 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr4tj\" (UniqueName: \"kubernetes.io/projected/8c02071d-0a01-4d7d-bbd5-727544dd7fbe-kube-api-access-hr4tj\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:55 crc kubenswrapper[4743]: I0310 15:28:55.646507 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 10 15:28:57 crc kubenswrapper[4743]: I0310 15:28:57.442208 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:57 crc kubenswrapper[4743]: I0310 15:28:57.442608 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:57 crc kubenswrapper[4743]: I0310 15:28:57.472786 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:57 crc kubenswrapper[4743]: I0310 15:28:57.483128 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:58 crc kubenswrapper[4743]: I0310 15:28:58.382031 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 15:28:58 crc kubenswrapper[4743]: I0310 15:28:58.382086 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.250444 4743 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.251064 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="ceilometer-central-agent" containerID="cri-o://edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8" gracePeriod=30 Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.251236 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="proxy-httpd" containerID="cri-o://74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706" gracePeriod=30 Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.251827 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="ceilometer-notification-agent" containerID="cri-o://6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f" gracePeriod=30 Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.251298 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="sg-core" containerID="cri-o://17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f" gracePeriod=30 Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.263919 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.200:3000/\": EOF" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.321684 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chr79"] Mar 10 15:29:00 crc kubenswrapper[4743]: E0310 15:29:00.322368 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8c02071d-0a01-4d7d-bbd5-727544dd7fbe" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322387 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c02071d-0a01-4d7d-bbd5-727544dd7fbe" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: E0310 15:29:00.322400 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0532548b-3818-4f8f-bf72-c7ba529b4a8c" containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322407 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0532548b-3818-4f8f-bf72-c7ba529b4a8c" containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: E0310 15:29:00.322433 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab66fedc-1151-429b-9301-14e20af0cc44" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322439 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab66fedc-1151-429b-9301-14e20af0cc44" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: E0310 15:29:00.322450 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecaa5958-eab2-4af0-b0ba-43d0c91517f1" containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322455 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecaa5958-eab2-4af0-b0ba-43d0c91517f1" containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: E0310 15:29:00.322478 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354ad36f-b0fb-4520-b5ed-b9ca3a65579c" containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322484 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="354ad36f-b0fb-4520-b5ed-b9ca3a65579c" 
containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: E0310 15:29:00.322497 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b692e051-bc25-4039-ac2d-10d17a93a44a" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322503 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b692e051-bc25-4039-ac2d-10d17a93a44a" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322673 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0532548b-3818-4f8f-bf72-c7ba529b4a8c" containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322693 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="354ad36f-b0fb-4520-b5ed-b9ca3a65579c" containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322703 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c02071d-0a01-4d7d-bbd5-727544dd7fbe" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322721 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecaa5958-eab2-4af0-b0ba-43d0c91517f1" containerName="mariadb-database-create" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322735 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab66fedc-1151-429b-9301-14e20af0cc44" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.322745 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b692e051-bc25-4039-ac2d-10d17a93a44a" containerName="mariadb-account-create-update" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.323354 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.325505 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v7jvx" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.327720 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.327773 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.336778 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chr79"] Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.357032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.357188 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45vm\" (UniqueName: \"kubernetes.io/projected/aed9e7ad-ba13-436c-bf27-46d65eb60af3-kube-api-access-x45vm\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.357269 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-scripts\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " 
pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.357336 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-config-data\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.418342 4743 generic.go:334] "Generic (PLEG): container finished" podID="0699eee8-2d25-4c52-827e-b287c47934f8" containerID="17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f" exitCode=2 Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.418395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerDied","Data":"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f"} Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.459607 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x45vm\" (UniqueName: \"kubernetes.io/projected/aed9e7ad-ba13-436c-bf27-46d65eb60af3-kube-api-access-x45vm\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.459689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-scripts\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.459737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-config-data\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.459788 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.467143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-scripts\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.467391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.472699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-config-data\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.487561 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x45vm\" (UniqueName: 
\"kubernetes.io/projected/aed9e7ad-ba13-436c-bf27-46d65eb60af3-kube-api-access-x45vm\") pod \"nova-cell0-conductor-db-sync-chr79\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.542439 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.542564 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.551182 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.644541 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:00 crc kubenswrapper[4743]: I0310 15:29:00.747987 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7954db6464-ns5cf" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.199337 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chr79"] Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.209125 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.288709 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-combined-ca-bundle\") pod \"0699eee8-2d25-4c52-827e-b287c47934f8\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.288800 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-log-httpd\") pod \"0699eee8-2d25-4c52-827e-b287c47934f8\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.289094 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r2jq\" (UniqueName: \"kubernetes.io/projected/0699eee8-2d25-4c52-827e-b287c47934f8-kube-api-access-5r2jq\") pod \"0699eee8-2d25-4c52-827e-b287c47934f8\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.289163 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-run-httpd\") pod \"0699eee8-2d25-4c52-827e-b287c47934f8\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.289201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-sg-core-conf-yaml\") pod \"0699eee8-2d25-4c52-827e-b287c47934f8\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.289266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-config-data\") pod \"0699eee8-2d25-4c52-827e-b287c47934f8\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.289298 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-scripts\") pod \"0699eee8-2d25-4c52-827e-b287c47934f8\" (UID: \"0699eee8-2d25-4c52-827e-b287c47934f8\") " Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.293472 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0699eee8-2d25-4c52-827e-b287c47934f8" (UID: "0699eee8-2d25-4c52-827e-b287c47934f8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.293634 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0699eee8-2d25-4c52-827e-b287c47934f8" (UID: "0699eee8-2d25-4c52-827e-b287c47934f8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.296896 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-scripts" (OuterVolumeSpecName: "scripts") pod "0699eee8-2d25-4c52-827e-b287c47934f8" (UID: "0699eee8-2d25-4c52-827e-b287c47934f8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.300168 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0699eee8-2d25-4c52-827e-b287c47934f8-kube-api-access-5r2jq" (OuterVolumeSpecName: "kube-api-access-5r2jq") pod "0699eee8-2d25-4c52-827e-b287c47934f8" (UID: "0699eee8-2d25-4c52-827e-b287c47934f8"). InnerVolumeSpecName "kube-api-access-5r2jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.325948 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0699eee8-2d25-4c52-827e-b287c47934f8" (UID: "0699eee8-2d25-4c52-827e-b287c47934f8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.392170 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r2jq\" (UniqueName: \"kubernetes.io/projected/0699eee8-2d25-4c52-827e-b287c47934f8-kube-api-access-5r2jq\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.392228 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.392242 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.392254 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-scripts\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.392265 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0699eee8-2d25-4c52-827e-b287c47934f8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.411536 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0699eee8-2d25-4c52-827e-b287c47934f8" (UID: "0699eee8-2d25-4c52-827e-b287c47934f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433478 4743 generic.go:334] "Generic (PLEG): container finished" podID="0699eee8-2d25-4c52-827e-b287c47934f8" containerID="74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706" exitCode=0 Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433557 4743 generic.go:334] "Generic (PLEG): container finished" podID="0699eee8-2d25-4c52-827e-b287c47934f8" containerID="6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f" exitCode=0 Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433564 4743 generic.go:334] "Generic (PLEG): container finished" podID="0699eee8-2d25-4c52-827e-b287c47934f8" containerID="edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8" exitCode=0 Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerDied","Data":"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706"} Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433595 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433631 4743 scope.go:117] "RemoveContainer" containerID="74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerDied","Data":"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f"} Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433787 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerDied","Data":"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8"} Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.433807 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0699eee8-2d25-4c52-827e-b287c47934f8","Type":"ContainerDied","Data":"afa1d28e558d298911eb144b000018d62caf8d2a7f8e4867eba367f3865a1095"} Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.436797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-chr79" event={"ID":"aed9e7ad-ba13-436c-bf27-46d65eb60af3","Type":"ContainerStarted","Data":"871a892be92e4b4441ccba42e2f27bf47c82d0d3bc9c0528cdefffe9e37f5c69"} Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.454335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-config-data" (OuterVolumeSpecName: "config-data") pod "0699eee8-2d25-4c52-827e-b287c47934f8" (UID: "0699eee8-2d25-4c52-827e-b287c47934f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.493964 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.493999 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0699eee8-2d25-4c52-827e-b287c47934f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.522419 4743 scope.go:117] "RemoveContainer" containerID="17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.546930 4743 scope.go:117] "RemoveContainer" containerID="6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.574679 4743 scope.go:117] "RemoveContainer" containerID="edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.602648 4743 scope.go:117] "RemoveContainer" containerID="74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706" Mar 10 15:29:01 crc kubenswrapper[4743]: E0310 15:29:01.603288 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706\": container with ID starting with 74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706 not found: ID does not exist" containerID="74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.603332 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706"} 
err="failed to get container status \"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706\": rpc error: code = NotFound desc = could not find container \"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706\": container with ID starting with 74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706 not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.603357 4743 scope.go:117] "RemoveContainer" containerID="17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f" Mar 10 15:29:01 crc kubenswrapper[4743]: E0310 15:29:01.603979 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f\": container with ID starting with 17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f not found: ID does not exist" containerID="17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.604039 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f"} err="failed to get container status \"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f\": rpc error: code = NotFound desc = could not find container \"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f\": container with ID starting with 17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.604053 4743 scope.go:117] "RemoveContainer" containerID="6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f" Mar 10 15:29:01 crc kubenswrapper[4743]: E0310 15:29:01.604596 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f\": container with ID starting with 6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f not found: ID does not exist" containerID="6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.604642 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f"} err="failed to get container status \"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f\": rpc error: code = NotFound desc = could not find container \"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f\": container with ID starting with 6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.604682 4743 scope.go:117] "RemoveContainer" containerID="edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8" Mar 10 15:29:01 crc kubenswrapper[4743]: E0310 15:29:01.605323 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8\": container with ID starting with edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8 not found: ID does not exist" containerID="edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.605356 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8"} err="failed to get container status \"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8\": rpc error: code = NotFound desc = could not find container \"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8\": container with ID 
starting with edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8 not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.605687 4743 scope.go:117] "RemoveContainer" containerID="74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.606289 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706"} err="failed to get container status \"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706\": rpc error: code = NotFound desc = could not find container \"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706\": container with ID starting with 74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706 not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.606317 4743 scope.go:117] "RemoveContainer" containerID="17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.606760 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f"} err="failed to get container status \"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f\": rpc error: code = NotFound desc = could not find container \"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f\": container with ID starting with 17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.606832 4743 scope.go:117] "RemoveContainer" containerID="6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.607227 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f"} err="failed to get container status \"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f\": rpc error: code = NotFound desc = could not find container \"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f\": container with ID starting with 6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.607262 4743 scope.go:117] "RemoveContainer" containerID="edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.607915 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8"} err="failed to get container status \"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8\": rpc error: code = NotFound desc = could not find container \"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8\": container with ID starting with edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8 not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.607938 4743 scope.go:117] "RemoveContainer" containerID="74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.608265 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706"} err="failed to get container status \"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706\": rpc error: code = NotFound desc = could not find container \"74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706\": container with ID starting with 74814294041abd04594ebdbbef8930d88a192df640659368c2567f54cfcc8706 not found: ID does not 
exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.608297 4743 scope.go:117] "RemoveContainer" containerID="17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.608781 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f"} err="failed to get container status \"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f\": rpc error: code = NotFound desc = could not find container \"17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f\": container with ID starting with 17d6576efbffdca42f9dd401858fc5e12c1f8da99dcffabe19684a5bb6d9495f not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.608825 4743 scope.go:117] "RemoveContainer" containerID="6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.609204 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f"} err="failed to get container status \"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f\": rpc error: code = NotFound desc = could not find container \"6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f\": container with ID starting with 6cbd5b1d0b250227f7eb1f24305f8d8aae65cc9848333ca253682ae3eaa60e7f not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.609229 4743 scope.go:117] "RemoveContainer" containerID="edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.610104 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8"} err="failed to get container status 
\"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8\": rpc error: code = NotFound desc = could not find container \"edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8\": container with ID starting with edb9801f020a399053ac73e14f40519831d9545b6f34274788adf455aee8b2b8 not found: ID does not exist" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.767344 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.778555 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.796922 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:01 crc kubenswrapper[4743]: E0310 15:29:01.800421 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="sg-core" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.800453 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="sg-core" Mar 10 15:29:01 crc kubenswrapper[4743]: E0310 15:29:01.800462 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="ceilometer-notification-agent" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.800468 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="ceilometer-notification-agent" Mar 10 15:29:01 crc kubenswrapper[4743]: E0310 15:29:01.800494 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="proxy-httpd" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.800500 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="proxy-httpd" Mar 10 15:29:01 crc 
kubenswrapper[4743]: E0310 15:29:01.800532 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="ceilometer-central-agent" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.800538 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="ceilometer-central-agent" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.801206 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="ceilometer-central-agent" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.801227 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="ceilometer-notification-agent" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.801252 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="sg-core" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.801271 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" containerName="proxy-httpd" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.803407 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.816736 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.839100 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.839458 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.902763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-scripts\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.902831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-config-data\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.903155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-log-httpd\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.903261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " 
pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.903349 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-run-httpd\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.903475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.903578 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pkjq\" (UniqueName: \"kubernetes.io/projected/635e30a3-e32c-460c-bb1e-f7360e49d3e2-kube-api-access-4pkjq\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:01 crc kubenswrapper[4743]: I0310 15:29:01.926042 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0699eee8-2d25-4c52-827e-b287c47934f8" path="/var/lib/kubelet/pods/0699eee8-2d25-4c52-827e-b287c47934f8/volumes" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.006227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-log-httpd\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.006294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.006324 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-run-httpd\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.006370 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.006424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pkjq\" (UniqueName: \"kubernetes.io/projected/635e30a3-e32c-460c-bb1e-f7360e49d3e2-kube-api-access-4pkjq\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.006899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-scripts\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.007364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-config-data\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 
15:29:02.007721 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-log-httpd\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.008015 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-run-httpd\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.011420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.011454 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-scripts\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.011992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-config-data\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.013575 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " 
pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.027912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pkjq\" (UniqueName: \"kubernetes.io/projected/635e30a3-e32c-460c-bb1e-f7360e49d3e2-kube-api-access-4pkjq\") pod \"ceilometer-0\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.161378 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:02 crc kubenswrapper[4743]: I0310 15:29:02.694671 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:03 crc kubenswrapper[4743]: I0310 15:29:03.461748 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerStarted","Data":"39b90851bfb04f3bb0581746e7fb38fdbeb8b536b3e4c5da6f3d771738e9b254"} Mar 10 15:29:03 crc kubenswrapper[4743]: I0310 15:29:03.462474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerStarted","Data":"9c6c999d823858c5c9cc4f7587015346e2776097be972d59297682a05e62c80a"} Mar 10 15:29:04 crc kubenswrapper[4743]: I0310 15:29:04.472490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerStarted","Data":"015a8c645b065e2838156741ec1266d428e571877fffabb2c45604444bb15596"} Mar 10 15:29:07 crc kubenswrapper[4743]: I0310 15:29:07.455901 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:07 crc kubenswrapper[4743]: I0310 15:29:07.512975 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0001988-feba-4afe-9068-071af12a6fd7" containerID="1a4a17b7c6b5e58ecc54899710b1a069b044db12fac0070f22bb4b2126726a01" 
exitCode=137 Mar 10 15:29:07 crc kubenswrapper[4743]: I0310 15:29:07.513034 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7954db6464-ns5cf" event={"ID":"c0001988-feba-4afe-9068-071af12a6fd7","Type":"ContainerDied","Data":"1a4a17b7c6b5e58ecc54899710b1a069b044db12fac0070f22bb4b2126726a01"} Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.514550 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.540290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7954db6464-ns5cf" event={"ID":"c0001988-feba-4afe-9068-071af12a6fd7","Type":"ContainerDied","Data":"5d57ec9d60f3343ab1f3ba5e638d1d3474bbd96f2b84d503fe676d5e894917e6"} Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.540353 4743 scope.go:117] "RemoveContainer" containerID="f0661b001f1e4aa05e793a6f1f5306c28f23ea2a63110c85c0439180d4cfb904" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.540367 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7954db6464-ns5cf" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.694712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xssfx\" (UniqueName: \"kubernetes.io/projected/c0001988-feba-4afe-9068-071af12a6fd7-kube-api-access-xssfx\") pod \"c0001988-feba-4afe-9068-071af12a6fd7\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.694834 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-config-data\") pod \"c0001988-feba-4afe-9068-071af12a6fd7\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.695005 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-tls-certs\") pod \"c0001988-feba-4afe-9068-071af12a6fd7\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.695089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-scripts\") pod \"c0001988-feba-4afe-9068-071af12a6fd7\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.695152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-secret-key\") pod \"c0001988-feba-4afe-9068-071af12a6fd7\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.695204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c0001988-feba-4afe-9068-071af12a6fd7-logs\") pod \"c0001988-feba-4afe-9068-071af12a6fd7\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.695224 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-combined-ca-bundle\") pod \"c0001988-feba-4afe-9068-071af12a6fd7\" (UID: \"c0001988-feba-4afe-9068-071af12a6fd7\") " Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.695835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0001988-feba-4afe-9068-071af12a6fd7-logs" (OuterVolumeSpecName: "logs") pod "c0001988-feba-4afe-9068-071af12a6fd7" (UID: "c0001988-feba-4afe-9068-071af12a6fd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.701133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0001988-feba-4afe-9068-071af12a6fd7-kube-api-access-xssfx" (OuterVolumeSpecName: "kube-api-access-xssfx") pod "c0001988-feba-4afe-9068-071af12a6fd7" (UID: "c0001988-feba-4afe-9068-071af12a6fd7"). InnerVolumeSpecName "kube-api-access-xssfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.701510 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c0001988-feba-4afe-9068-071af12a6fd7" (UID: "c0001988-feba-4afe-9068-071af12a6fd7"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.725285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-scripts" (OuterVolumeSpecName: "scripts") pod "c0001988-feba-4afe-9068-071af12a6fd7" (UID: "c0001988-feba-4afe-9068-071af12a6fd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.727429 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-config-data" (OuterVolumeSpecName: "config-data") pod "c0001988-feba-4afe-9068-071af12a6fd7" (UID: "c0001988-feba-4afe-9068-071af12a6fd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.728622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0001988-feba-4afe-9068-071af12a6fd7" (UID: "c0001988-feba-4afe-9068-071af12a6fd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.732039 4743 scope.go:117] "RemoveContainer" containerID="1a4a17b7c6b5e58ecc54899710b1a069b044db12fac0070f22bb4b2126726a01" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.750295 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "c0001988-feba-4afe-9068-071af12a6fd7" (UID: "c0001988-feba-4afe-9068-071af12a6fd7"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.797803 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xssfx\" (UniqueName: \"kubernetes.io/projected/c0001988-feba-4afe-9068-071af12a6fd7-kube-api-access-xssfx\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.797863 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.797879 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.797891 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0001988-feba-4afe-9068-071af12a6fd7-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.797910 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.797921 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0001988-feba-4afe-9068-071af12a6fd7-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.797933 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0001988-feba-4afe-9068-071af12a6fd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.884313 4743 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7954db6464-ns5cf"] Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.894714 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7954db6464-ns5cf"] Mar 10 15:29:09 crc kubenswrapper[4743]: I0310 15:29:09.943012 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0001988-feba-4afe-9068-071af12a6fd7" path="/var/lib/kubelet/pods/c0001988-feba-4afe-9068-071af12a6fd7/volumes" Mar 10 15:29:10 crc kubenswrapper[4743]: I0310 15:29:10.552852 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerStarted","Data":"6359705c3fbd30affc67c5043ef3f7593149e6039bb5e93b44882eaf8ea6e879"} Mar 10 15:29:10 crc kubenswrapper[4743]: I0310 15:29:10.560255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-chr79" event={"ID":"aed9e7ad-ba13-436c-bf27-46d65eb60af3","Type":"ContainerStarted","Data":"a8dd7b460bd2e7414390c8d7085a0dcb6b84aac52fe5cdeda52c284f014e06be"} Mar 10 15:29:10 crc kubenswrapper[4743]: I0310 15:29:10.588403 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-chr79" podStartSLOduration=2.549134311 podStartE2EDuration="10.588378092s" podCreationTimestamp="2026-03-10 15:29:00 +0000 UTC" firstStartedPulling="2026-03-10 15:29:01.221578403 +0000 UTC m=+1405.928393151" lastFinishedPulling="2026-03-10 15:29:09.260822184 +0000 UTC m=+1413.967636932" observedRunningTime="2026-03-10 15:29:10.579199969 +0000 UTC m=+1415.286014717" watchObservedRunningTime="2026-03-10 15:29:10.588378092 +0000 UTC m=+1415.295192830" Mar 10 15:29:11 crc kubenswrapper[4743]: I0310 15:29:11.252505 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:29:11 crc kubenswrapper[4743]: I0310 15:29:11.252588 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:29:13 crc kubenswrapper[4743]: I0310 15:29:13.599269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerStarted","Data":"d8908dfd593ad3fa6987fa7a3f861bccf4eac533c5ea2a3f033ea4c287d372eb"} Mar 10 15:29:13 crc kubenswrapper[4743]: I0310 15:29:13.600049 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="ceilometer-central-agent" containerID="cri-o://39b90851bfb04f3bb0581746e7fb38fdbeb8b536b3e4c5da6f3d771738e9b254" gracePeriod=30 Mar 10 15:29:13 crc kubenswrapper[4743]: I0310 15:29:13.600140 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:29:13 crc kubenswrapper[4743]: I0310 15:29:13.600596 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="proxy-httpd" containerID="cri-o://d8908dfd593ad3fa6987fa7a3f861bccf4eac533c5ea2a3f033ea4c287d372eb" gracePeriod=30 Mar 10 15:29:13 crc kubenswrapper[4743]: I0310 15:29:13.600641 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="sg-core" containerID="cri-o://6359705c3fbd30affc67c5043ef3f7593149e6039bb5e93b44882eaf8ea6e879" 
gracePeriod=30 Mar 10 15:29:13 crc kubenswrapper[4743]: I0310 15:29:13.600678 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="ceilometer-notification-agent" containerID="cri-o://015a8c645b065e2838156741ec1266d428e571877fffabb2c45604444bb15596" gracePeriod=30 Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.617274 4743 generic.go:334] "Generic (PLEG): container finished" podID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerID="d8908dfd593ad3fa6987fa7a3f861bccf4eac533c5ea2a3f033ea4c287d372eb" exitCode=0 Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.618081 4743 generic.go:334] "Generic (PLEG): container finished" podID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerID="6359705c3fbd30affc67c5043ef3f7593149e6039bb5e93b44882eaf8ea6e879" exitCode=2 Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.618113 4743 generic.go:334] "Generic (PLEG): container finished" podID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerID="015a8c645b065e2838156741ec1266d428e571877fffabb2c45604444bb15596" exitCode=0 Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.618135 4743 generic.go:334] "Generic (PLEG): container finished" podID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerID="39b90851bfb04f3bb0581746e7fb38fdbeb8b536b3e4c5da6f3d771738e9b254" exitCode=0 Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.618176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerDied","Data":"d8908dfd593ad3fa6987fa7a3f861bccf4eac533c5ea2a3f033ea4c287d372eb"} Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.618228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerDied","Data":"6359705c3fbd30affc67c5043ef3f7593149e6039bb5e93b44882eaf8ea6e879"} Mar 10 
15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.618258 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerDied","Data":"015a8c645b065e2838156741ec1266d428e571877fffabb2c45604444bb15596"} Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.618281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerDied","Data":"39b90851bfb04f3bb0581746e7fb38fdbeb8b536b3e4c5da6f3d771738e9b254"} Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.786963 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.833466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-sg-core-conf-yaml\") pod \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.833570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pkjq\" (UniqueName: \"kubernetes.io/projected/635e30a3-e32c-460c-bb1e-f7360e49d3e2-kube-api-access-4pkjq\") pod \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.833640 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-run-httpd\") pod \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.833708 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-combined-ca-bundle\") pod \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.834048 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-log-httpd\") pod \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.834290 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-scripts\") pod \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.834429 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-config-data\") pod \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\" (UID: \"635e30a3-e32c-460c-bb1e-f7360e49d3e2\") " Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.835397 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "635e30a3-e32c-460c-bb1e-f7360e49d3e2" (UID: "635e30a3-e32c-460c-bb1e-f7360e49d3e2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.836754 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "635e30a3-e32c-460c-bb1e-f7360e49d3e2" (UID: "635e30a3-e32c-460c-bb1e-f7360e49d3e2"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.841699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-scripts" (OuterVolumeSpecName: "scripts") pod "635e30a3-e32c-460c-bb1e-f7360e49d3e2" (UID: "635e30a3-e32c-460c-bb1e-f7360e49d3e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.856303 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635e30a3-e32c-460c-bb1e-f7360e49d3e2-kube-api-access-4pkjq" (OuterVolumeSpecName: "kube-api-access-4pkjq") pod "635e30a3-e32c-460c-bb1e-f7360e49d3e2" (UID: "635e30a3-e32c-460c-bb1e-f7360e49d3e2"). InnerVolumeSpecName "kube-api-access-4pkjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.865029 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "635e30a3-e32c-460c-bb1e-f7360e49d3e2" (UID: "635e30a3-e32c-460c-bb1e-f7360e49d3e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.929974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "635e30a3-e32c-460c-bb1e-f7360e49d3e2" (UID: "635e30a3-e32c-460c-bb1e-f7360e49d3e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.937021 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pkjq\" (UniqueName: \"kubernetes.io/projected/635e30a3-e32c-460c-bb1e-f7360e49d3e2-kube-api-access-4pkjq\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.937060 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.937071 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.937081 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/635e30a3-e32c-460c-bb1e-f7360e49d3e2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.937088 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.937096 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:14 crc kubenswrapper[4743]: I0310 15:29:14.946323 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-config-data" (OuterVolumeSpecName: "config-data") pod "635e30a3-e32c-460c-bb1e-f7360e49d3e2" (UID: "635e30a3-e32c-460c-bb1e-f7360e49d3e2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.039603 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e30a3-e32c-460c-bb1e-f7360e49d3e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.630247 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"635e30a3-e32c-460c-bb1e-f7360e49d3e2","Type":"ContainerDied","Data":"9c6c999d823858c5c9cc4f7587015346e2776097be972d59297682a05e62c80a"} Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.630316 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.630594 4743 scope.go:117] "RemoveContainer" containerID="d8908dfd593ad3fa6987fa7a3f861bccf4eac533c5ea2a3f033ea4c287d372eb" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.653503 4743 scope.go:117] "RemoveContainer" containerID="6359705c3fbd30affc67c5043ef3f7593149e6039bb5e93b44882eaf8ea6e879" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.667621 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.676352 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.694782 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:15 crc kubenswrapper[4743]: E0310 15:29:15.695295 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695324 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" 
Mar 10 15:29:15 crc kubenswrapper[4743]: E0310 15:29:15.695345 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon-log" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695354 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon-log" Mar 10 15:29:15 crc kubenswrapper[4743]: E0310 15:29:15.695370 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695380 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" Mar 10 15:29:15 crc kubenswrapper[4743]: E0310 15:29:15.695412 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="sg-core" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695423 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="sg-core" Mar 10 15:29:15 crc kubenswrapper[4743]: E0310 15:29:15.695436 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="ceilometer-notification-agent" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695444 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="ceilometer-notification-agent" Mar 10 15:29:15 crc kubenswrapper[4743]: E0310 15:29:15.695462 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="ceilometer-central-agent" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695471 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="ceilometer-central-agent" Mar 10 
15:29:15 crc kubenswrapper[4743]: E0310 15:29:15.695501 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="proxy-httpd" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695509 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="proxy-httpd" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695689 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="ceilometer-central-agent" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695704 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon-log" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695714 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695725 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="ceilometer-notification-agent" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695737 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="proxy-httpd" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.695752 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" containerName="sg-core" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.696160 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0001988-feba-4afe-9068-071af12a6fd7" containerName="horizon" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.697779 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.706964 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.706964 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.715812 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.775265 4743 scope.go:117] "RemoveContainer" containerID="015a8c645b065e2838156741ec1266d428e571877fffabb2c45604444bb15596" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.805687 4743 scope.go:117] "RemoveContainer" containerID="39b90851bfb04f3bb0581746e7fb38fdbeb8b536b3e4c5da6f3d771738e9b254" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.868516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.868605 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-scripts\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.868623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-log-httpd\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 
10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.868664 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.868686 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8sm\" (UniqueName: \"kubernetes.io/projected/d4ce59fe-b30d-42e1-a49f-5108fec386c9-kube-api-access-4m8sm\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.868741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-run-httpd\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.868769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-config-data\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.926468 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="635e30a3-e32c-460c-bb1e-f7360e49d3e2" path="/var/lib/kubelet/pods/635e30a3-e32c-460c-bb1e-f7360e49d3e2/volumes" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.970531 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.971423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-log-httpd\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.971454 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-scripts\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.971496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.971524 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m8sm\" (UniqueName: \"kubernetes.io/projected/d4ce59fe-b30d-42e1-a49f-5108fec386c9-kube-api-access-4m8sm\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.971587 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-run-httpd\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 
15:29:15.971615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-config-data\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.974939 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-log-httpd\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.975749 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-run-httpd\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.975946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.979447 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-config-data\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.986974 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-scripts\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " 
pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.990284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:15 crc kubenswrapper[4743]: I0310 15:29:15.996934 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m8sm\" (UniqueName: \"kubernetes.io/projected/d4ce59fe-b30d-42e1-a49f-5108fec386c9-kube-api-access-4m8sm\") pod \"ceilometer-0\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " pod="openstack/ceilometer-0" Mar 10 15:29:16 crc kubenswrapper[4743]: I0310 15:29:16.078328 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:16 crc kubenswrapper[4743]: I0310 15:29:16.578232 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:16 crc kubenswrapper[4743]: I0310 15:29:16.640248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerStarted","Data":"29920f485f44ca3b2c8d042f9ce0c7807ffc11a117cda62d31cfe1ba0a9ae62c"} Mar 10 15:29:18 crc kubenswrapper[4743]: I0310 15:29:18.662802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerStarted","Data":"a1961592dd52d3e0efe5ae8ab07de8d9689ab192d3e5fdc7c0e020a1c8a10958"} Mar 10 15:29:18 crc kubenswrapper[4743]: I0310 15:29:18.663403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerStarted","Data":"4a15918033e02909941377b979e69147db358742340567004d2dec3ebbf02d5a"} Mar 10 15:29:19 crc kubenswrapper[4743]: 
I0310 15:29:19.673218 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerStarted","Data":"1aca20ffe56168504c3fb352e4f749f0347b8838515c117b285546d7bd6dcb1e"} Mar 10 15:29:21 crc kubenswrapper[4743]: I0310 15:29:21.693714 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerStarted","Data":"3cc967ecf33e219e2e63415a97df2f76041124deffcd702d924189341f92de63"} Mar 10 15:29:21 crc kubenswrapper[4743]: I0310 15:29:21.694320 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:29:21 crc kubenswrapper[4743]: I0310 15:29:21.717210 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.528507522 podStartE2EDuration="6.7171896s" podCreationTimestamp="2026-03-10 15:29:15 +0000 UTC" firstStartedPulling="2026-03-10 15:29:16.5897741 +0000 UTC m=+1421.296588848" lastFinishedPulling="2026-03-10 15:29:20.778456178 +0000 UTC m=+1425.485270926" observedRunningTime="2026-03-10 15:29:21.714080321 +0000 UTC m=+1426.420895069" watchObservedRunningTime="2026-03-10 15:29:21.7171896 +0000 UTC m=+1426.424004348" Mar 10 15:29:32 crc kubenswrapper[4743]: I0310 15:29:32.829942 4743 generic.go:334] "Generic (PLEG): container finished" podID="aed9e7ad-ba13-436c-bf27-46d65eb60af3" containerID="a8dd7b460bd2e7414390c8d7085a0dcb6b84aac52fe5cdeda52c284f014e06be" exitCode=0 Mar 10 15:29:32 crc kubenswrapper[4743]: I0310 15:29:32.830079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-chr79" event={"ID":"aed9e7ad-ba13-436c-bf27-46d65eb60af3","Type":"ContainerDied","Data":"a8dd7b460bd2e7414390c8d7085a0dcb6b84aac52fe5cdeda52c284f014e06be"} Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.311515 4743 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.392609 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-scripts\") pod \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.392653 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-combined-ca-bundle\") pod \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.392707 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x45vm\" (UniqueName: \"kubernetes.io/projected/aed9e7ad-ba13-436c-bf27-46d65eb60af3-kube-api-access-x45vm\") pod \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.392876 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-config-data\") pod \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\" (UID: \"aed9e7ad-ba13-436c-bf27-46d65eb60af3\") " Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.400501 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-scripts" (OuterVolumeSpecName: "scripts") pod "aed9e7ad-ba13-436c-bf27-46d65eb60af3" (UID: "aed9e7ad-ba13-436c-bf27-46d65eb60af3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.400900 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed9e7ad-ba13-436c-bf27-46d65eb60af3-kube-api-access-x45vm" (OuterVolumeSpecName: "kube-api-access-x45vm") pod "aed9e7ad-ba13-436c-bf27-46d65eb60af3" (UID: "aed9e7ad-ba13-436c-bf27-46d65eb60af3"). InnerVolumeSpecName "kube-api-access-x45vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.424988 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-config-data" (OuterVolumeSpecName: "config-data") pod "aed9e7ad-ba13-436c-bf27-46d65eb60af3" (UID: "aed9e7ad-ba13-436c-bf27-46d65eb60af3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.427730 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aed9e7ad-ba13-436c-bf27-46d65eb60af3" (UID: "aed9e7ad-ba13-436c-bf27-46d65eb60af3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.495615 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.495651 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.495663 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x45vm\" (UniqueName: \"kubernetes.io/projected/aed9e7ad-ba13-436c-bf27-46d65eb60af3-kube-api-access-x45vm\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.495671 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed9e7ad-ba13-436c-bf27-46d65eb60af3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.856671 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-chr79" event={"ID":"aed9e7ad-ba13-436c-bf27-46d65eb60af3","Type":"ContainerDied","Data":"871a892be92e4b4441ccba42e2f27bf47c82d0d3bc9c0528cdefffe9e37f5c69"} Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.856720 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="871a892be92e4b4441ccba42e2f27bf47c82d0d3bc9c0528cdefffe9e37f5c69" Mar 10 15:29:34 crc kubenswrapper[4743]: I0310 15:29:34.856721 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-chr79" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.040458 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 15:29:35 crc kubenswrapper[4743]: E0310 15:29:35.041105 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed9e7ad-ba13-436c-bf27-46d65eb60af3" containerName="nova-cell0-conductor-db-sync" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.041141 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed9e7ad-ba13-436c-bf27-46d65eb60af3" containerName="nova-cell0-conductor-db-sync" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.041483 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed9e7ad-ba13-436c-bf27-46d65eb60af3" containerName="nova-cell0-conductor-db-sync" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.044247 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.047506 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.047970 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v7jvx" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.052614 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.112120 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcc9336-67de-48d7-8af4-bfb56233b69c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 
15:29:35.112404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgrnl\" (UniqueName: \"kubernetes.io/projected/fbcc9336-67de-48d7-8af4-bfb56233b69c-kube-api-access-wgrnl\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.112451 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcc9336-67de-48d7-8af4-bfb56233b69c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.214972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgrnl\" (UniqueName: \"kubernetes.io/projected/fbcc9336-67de-48d7-8af4-bfb56233b69c-kube-api-access-wgrnl\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.215045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcc9336-67de-48d7-8af4-bfb56233b69c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.215131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcc9336-67de-48d7-8af4-bfb56233b69c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.220922 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcc9336-67de-48d7-8af4-bfb56233b69c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.221096 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcc9336-67de-48d7-8af4-bfb56233b69c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.231795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgrnl\" (UniqueName: \"kubernetes.io/projected/fbcc9336-67de-48d7-8af4-bfb56233b69c-kube-api-access-wgrnl\") pod \"nova-cell0-conductor-0\" (UID: \"fbcc9336-67de-48d7-8af4-bfb56233b69c\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.372149 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.846872 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 15:29:35 crc kubenswrapper[4743]: W0310 15:29:35.855109 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbcc9336_67de_48d7_8af4_bfb56233b69c.slice/crio-7d0da1d71363b2cf78dc47bd5037819f993676f8f9b5d9a655972c237c141444 WatchSource:0}: Error finding container 7d0da1d71363b2cf78dc47bd5037819f993676f8f9b5d9a655972c237c141444: Status 404 returned error can't find the container with id 7d0da1d71363b2cf78dc47bd5037819f993676f8f9b5d9a655972c237c141444 Mar 10 15:29:35 crc kubenswrapper[4743]: I0310 15:29:35.871104 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fbcc9336-67de-48d7-8af4-bfb56233b69c","Type":"ContainerStarted","Data":"7d0da1d71363b2cf78dc47bd5037819f993676f8f9b5d9a655972c237c141444"} Mar 10 15:29:36 crc kubenswrapper[4743]: I0310 15:29:36.882228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fbcc9336-67de-48d7-8af4-bfb56233b69c","Type":"ContainerStarted","Data":"2dfef58818e959c73711751883d0f9738b7b516ee153f5edef44587095a35d2f"} Mar 10 15:29:36 crc kubenswrapper[4743]: I0310 15:29:36.882696 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:36 crc kubenswrapper[4743]: I0310 15:29:36.912631 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.912605942 podStartE2EDuration="1.912605942s" podCreationTimestamp="2026-03-10 15:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
15:29:36.90626355 +0000 UTC m=+1441.613078318" watchObservedRunningTime="2026-03-10 15:29:36.912605942 +0000 UTC m=+1441.619420710" Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.252419 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.253107 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.253186 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.254227 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd61460384d6cf2cf3d97e1581a2275d487c44dd87d687ca072c07eb9d139f79"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.254313 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://fd61460384d6cf2cf3d97e1581a2275d487c44dd87d687ca072c07eb9d139f79" gracePeriod=600 Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.939190 4743 generic.go:334] 
"Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="fd61460384d6cf2cf3d97e1581a2275d487c44dd87d687ca072c07eb9d139f79" exitCode=0 Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.939255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"fd61460384d6cf2cf3d97e1581a2275d487c44dd87d687ca072c07eb9d139f79"} Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.940196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d"} Mar 10 15:29:41 crc kubenswrapper[4743]: I0310 15:29:41.940300 4743 scope.go:117] "RemoveContainer" containerID="e87aabce4b79954ca871a938cf484c77623556f05115d13359a5bd9f0c4154c7" Mar 10 15:29:45 crc kubenswrapper[4743]: I0310 15:29:45.422687 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 15:29:45 crc kubenswrapper[4743]: I0310 15:29:45.949249 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-l2xjq"] Mar 10 15:29:45 crc kubenswrapper[4743]: I0310 15:29:45.951066 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:45 crc kubenswrapper[4743]: I0310 15:29:45.954954 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 15:29:45 crc kubenswrapper[4743]: I0310 15:29:45.956680 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 15:29:45 crc kubenswrapper[4743]: I0310 15:29:45.969461 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-l2xjq"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.071297 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbx6\" (UniqueName: \"kubernetes.io/projected/67f18369-c26d-4389-a49c-e0178c8d6db3-kube-api-access-5cbx6\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.071412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.071565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-scripts\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.071770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-config-data\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.159276 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.173772 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cbx6\" (UniqueName: \"kubernetes.io/projected/67f18369-c26d-4389-a49c-e0178c8d6db3-kube-api-access-5cbx6\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.173888 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.173926 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-scripts\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.174100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-config-data\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 
15:29:46.180623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-config-data\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.182397 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-scripts\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.186582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.201881 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.203261 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.230304 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbx6\" (UniqueName: \"kubernetes.io/projected/67f18369-c26d-4389-a49c-e0178c8d6db3-kube-api-access-5cbx6\") pod \"nova-cell0-cell-mapping-l2xjq\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.257394 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.266434 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.275745 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.275955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-config-data\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.276053 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7zw7\" (UniqueName: \"kubernetes.io/projected/0cfcb383-10fc-4149-af2f-fe50f3003f05-kube-api-access-j7zw7\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 
15:29:46.288551 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.316929 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.318341 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.325865 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.336166 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.390289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65nwx\" (UniqueName: \"kubernetes.io/projected/98838081-e32b-46ae-b757-72abda3f9737-kube-api-access-65nwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.390418 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-config-data\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.390476 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc 
kubenswrapper[4743]: I0310 15:29:46.390515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7zw7\" (UniqueName: \"kubernetes.io/projected/0cfcb383-10fc-4149-af2f-fe50f3003f05-kube-api-access-j7zw7\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.390579 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.390692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.401713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.427749 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7zw7\" (UniqueName: \"kubernetes.io/projected/0cfcb383-10fc-4149-af2f-fe50f3003f05-kube-api-access-j7zw7\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.428528 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-config-data\") pod \"nova-scheduler-0\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.494128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65nwx\" (UniqueName: \"kubernetes.io/projected/98838081-e32b-46ae-b757-72abda3f9737-kube-api-access-65nwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.494581 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.494678 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.503979 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.522804 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.537093 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.557540 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65nwx\" (UniqueName: \"kubernetes.io/projected/98838081-e32b-46ae-b757-72abda3f9737-kube-api-access-65nwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.569431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.574240 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.578269 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.633140 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.723292 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.723365 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ed08f17-0875-42b7-8764-d5589cf03e0b-logs\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.723487 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-config-data\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.723634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh59j\" (UniqueName: \"kubernetes.io/projected/3ed08f17-0875-42b7-8764-d5589cf03e0b-kube-api-access-nh59j\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.807943 4743 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.816123 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.821247 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.825779 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.825832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ed08f17-0875-42b7-8764-d5589cf03e0b-logs\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.825903 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-config-data\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.825983 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh59j\" (UniqueName: \"kubernetes.io/projected/3ed08f17-0875-42b7-8764-d5589cf03e0b-kube-api-access-nh59j\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.826773 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:46 crc 
kubenswrapper[4743]: I0310 15:29:46.827359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ed08f17-0875-42b7-8764-d5589cf03e0b-logs\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.836601 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.839957 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-config-data\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.860541 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.865446 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh59j\" (UniqueName: \"kubernetes.io/projected/3ed08f17-0875-42b7-8764-d5589cf03e0b-kube-api-access-nh59j\") pod \"nova-metadata-0\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " pod="openstack/nova-metadata-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.901707 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-858594bc89-2zfwg"] Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.904260 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.929880 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-config-data\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.930027 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6361004a-5259-434e-a899-81704ebee56c-logs\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.930081 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9c59\" (UniqueName: \"kubernetes.io/projected/6361004a-5259-434e-a899-81704ebee56c-kube-api-access-j9c59\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.930114 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:46 crc kubenswrapper[4743]: I0310 15:29:46.933213 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-2zfwg"] Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.006296 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.032717 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrsx\" (UniqueName: \"kubernetes.io/projected/b5cf5878-917c-4aa7-b209-37045ef1dc13-kube-api-access-tsrsx\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.032788 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-sb\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.032941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-nb\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.033053 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-config\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.033130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-config-data\") pod \"nova-api-0\" (UID: 
\"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.033159 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6361004a-5259-434e-a899-81704ebee56c-logs\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.033204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-swift-storage-0\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.033233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9c59\" (UniqueName: \"kubernetes.io/projected/6361004a-5259-434e-a899-81704ebee56c-kube-api-access-j9c59\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.033274 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.033315 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-svc\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 
15:29:47.035757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6361004a-5259-434e-a899-81704ebee56c-logs\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.048931 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.052958 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-config-data\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.060974 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9c59\" (UniqueName: \"kubernetes.io/projected/6361004a-5259-434e-a899-81704ebee56c-kube-api-access-j9c59\") pod \"nova-api-0\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") " pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.134787 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-nb\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.134928 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-config\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: 
\"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.135010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-swift-storage-0\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.135049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-svc\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.135090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrsx\" (UniqueName: \"kubernetes.io/projected/b5cf5878-917c-4aa7-b209-37045ef1dc13-kube-api-access-tsrsx\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.135132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-sb\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.136196 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-nb\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " 
pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.136208 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-swift-storage-0\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.136254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-config\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.136393 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-svc\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.137702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-sb\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.158086 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrsx\" (UniqueName: \"kubernetes.io/projected/b5cf5878-917c-4aa7-b209-37045ef1dc13-kube-api-access-tsrsx\") pod \"dnsmasq-dns-858594bc89-2zfwg\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 
15:29:47.162857 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.220771 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-l2xjq"] Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.253045 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.356453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.460874 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4rs44"] Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.463505 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.467628 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.468224 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.486850 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4rs44"] Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.544975 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.545006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-scripts\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: 
\"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.545072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.545156 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-config-data\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.545218 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dm5w\" (UniqueName: \"kubernetes.io/projected/8b077b1c-05fd-4ef3-8adb-42ff870116c9-kube-api-access-9dm5w\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.648281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.648758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-config-data\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.648811 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dm5w\" (UniqueName: \"kubernetes.io/projected/8b077b1c-05fd-4ef3-8adb-42ff870116c9-kube-api-access-9dm5w\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.648970 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-scripts\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.655800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-config-data\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.655917 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-scripts\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.656465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.673700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dm5w\" (UniqueName: \"kubernetes.io/projected/8b077b1c-05fd-4ef3-8adb-42ff870116c9-kube-api-access-9dm5w\") pod \"nova-cell1-conductor-db-sync-4rs44\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: W0310 15:29:47.679978 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ed08f17_0875_42b7_8764_d5589cf03e0b.slice/crio-e8ac037bb98ae007441d888ed9c63c6a731f5fad7b044e6d3f3999ad04095526 WatchSource:0}: Error finding container e8ac037bb98ae007441d888ed9c63c6a731f5fad7b044e6d3f3999ad04095526: Status 404 returned error can't find the container with id e8ac037bb98ae007441d888ed9c63c6a731f5fad7b044e6d3f3999ad04095526 Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.681242 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.806168 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.816946 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:47 crc kubenswrapper[4743]: W0310 15:29:47.858995 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6361004a_5259_434e_a899_81704ebee56c.slice/crio-d7d69a532823c2c9d51f44f104785b5d553b35e3bc40a2ac4d49edf8a4e1cc89 WatchSource:0}: Error finding container d7d69a532823c2c9d51f44f104785b5d553b35e3bc40a2ac4d49edf8a4e1cc89: Status 404 returned error can't find the container with id d7d69a532823c2c9d51f44f104785b5d553b35e3bc40a2ac4d49edf8a4e1cc89 Mar 10 15:29:47 crc kubenswrapper[4743]: I0310 15:29:47.986532 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-2zfwg"] Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.030030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"98838081-e32b-46ae-b757-72abda3f9737","Type":"ContainerStarted","Data":"5c4cc2b54ea06aab996e8938d05f039902f21a6e85ff01f8c7d92dcb72bc5a0a"} Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.031682 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cfcb383-10fc-4149-af2f-fe50f3003f05","Type":"ContainerStarted","Data":"daf51e774f504146b02ee543f5741cd5af1970cfcd8bf5c6035c5a971afba747"} Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.042492 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ed08f17-0875-42b7-8764-d5589cf03e0b","Type":"ContainerStarted","Data":"e8ac037bb98ae007441d888ed9c63c6a731f5fad7b044e6d3f3999ad04095526"} Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.045010 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-l2xjq" event={"ID":"67f18369-c26d-4389-a49c-e0178c8d6db3","Type":"ContainerStarted","Data":"3437e368284b6c57ea8906b6b04141fd77c8e95cea732ef59b968b2ee59909ac"} Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.045057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l2xjq" event={"ID":"67f18369-c26d-4389-a49c-e0178c8d6db3","Type":"ContainerStarted","Data":"f28d9e5ad44843928f4c9de5b2271b02aef8a1e7786c43f3f241f482079f7bd8"} Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.047189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" event={"ID":"b5cf5878-917c-4aa7-b209-37045ef1dc13","Type":"ContainerStarted","Data":"6044d5f9aec9a51fe6797ca7cef2df326356fc34b92c85cda23860411489852f"} Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.048818 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6361004a-5259-434e-a899-81704ebee56c","Type":"ContainerStarted","Data":"d7d69a532823c2c9d51f44f104785b5d553b35e3bc40a2ac4d49edf8a4e1cc89"} Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.064012 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-l2xjq" podStartSLOduration=3.063986945 podStartE2EDuration="3.063986945s" podCreationTimestamp="2026-03-10 15:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:48.057709996 +0000 UTC m=+1452.764524744" watchObservedRunningTime="2026-03-10 15:29:48.063986945 +0000 UTC m=+1452.770801693" Mar 10 15:29:48 crc kubenswrapper[4743]: I0310 15:29:48.362146 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4rs44"] Mar 10 15:29:48 crc kubenswrapper[4743]: W0310 15:29:48.389940 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b077b1c_05fd_4ef3_8adb_42ff870116c9.slice/crio-6a2201398775d20dac02e8655880032b77f79647079a6d7af346ef430362a04e WatchSource:0}: Error finding container 6a2201398775d20dac02e8655880032b77f79647079a6d7af346ef430362a04e: Status 404 returned error can't find the container with id 6a2201398775d20dac02e8655880032b77f79647079a6d7af346ef430362a04e Mar 10 15:29:49 crc kubenswrapper[4743]: I0310 15:29:49.065646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4rs44" event={"ID":"8b077b1c-05fd-4ef3-8adb-42ff870116c9","Type":"ContainerStarted","Data":"ac1e4cfb488b460578546001cf462b5ef2cf0dfd231f446088c2b77d097a0340"} Mar 10 15:29:49 crc kubenswrapper[4743]: I0310 15:29:49.065994 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4rs44" event={"ID":"8b077b1c-05fd-4ef3-8adb-42ff870116c9","Type":"ContainerStarted","Data":"6a2201398775d20dac02e8655880032b77f79647079a6d7af346ef430362a04e"} Mar 10 15:29:49 crc kubenswrapper[4743]: I0310 15:29:49.070279 4743 generic.go:334] "Generic (PLEG): container finished" podID="b5cf5878-917c-4aa7-b209-37045ef1dc13" containerID="52ec9b9398176cd1772b646c9f6973110da9f5838155696ccecc267acf1035e5" exitCode=0 Mar 10 15:29:49 crc kubenswrapper[4743]: I0310 15:29:49.071697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" event={"ID":"b5cf5878-917c-4aa7-b209-37045ef1dc13","Type":"ContainerDied","Data":"52ec9b9398176cd1772b646c9f6973110da9f5838155696ccecc267acf1035e5"} Mar 10 15:29:49 crc kubenswrapper[4743]: I0310 15:29:49.088387 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4rs44" podStartSLOduration=2.088365678 podStartE2EDuration="2.088365678s" podCreationTimestamp="2026-03-10 15:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:49.082885091 +0000 UTC m=+1453.789699849" watchObservedRunningTime="2026-03-10 15:29:49.088365678 +0000 UTC m=+1453.795180416" Mar 10 15:29:50 crc kubenswrapper[4743]: I0310 15:29:50.257440 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:50 crc kubenswrapper[4743]: I0310 15:29:50.271428 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.103392 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cfcb383-10fc-4149-af2f-fe50f3003f05","Type":"ContainerStarted","Data":"c477deaa1241d980859c751caba732c97a4904fb2a215eb37cfa9907fc5e0d1f"} Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.105938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ed08f17-0875-42b7-8764-d5589cf03e0b","Type":"ContainerStarted","Data":"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36"} Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.105996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ed08f17-0875-42b7-8764-d5589cf03e0b","Type":"ContainerStarted","Data":"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8"} Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.106110 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerName="nova-metadata-log" containerID="cri-o://338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8" gracePeriod=30 Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.106446 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerName="nova-metadata-metadata" containerID="cri-o://3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36" gracePeriod=30 Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.116769 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" event={"ID":"b5cf5878-917c-4aa7-b209-37045ef1dc13","Type":"ContainerStarted","Data":"2ee6c40424e960f0b7645ad5e92e770a52224861aefdedb07c9a0a706ff062f5"} Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.116965 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.121391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6361004a-5259-434e-a899-81704ebee56c","Type":"ContainerStarted","Data":"b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1"} Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.121448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6361004a-5259-434e-a899-81704ebee56c","Type":"ContainerStarted","Data":"1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15"} Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.124867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"98838081-e32b-46ae-b757-72abda3f9737","Type":"ContainerStarted","Data":"d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7"} Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.124978 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="98838081-e32b-46ae-b757-72abda3f9737" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7" gracePeriod=30 Mar 10 15:29:52 crc kubenswrapper[4743]: 
I0310 15:29:52.161752 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" podStartSLOduration=6.1617285410000004 podStartE2EDuration="6.161728541s" podCreationTimestamp="2026-03-10 15:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:52.15925341 +0000 UTC m=+1456.866068158" watchObservedRunningTime="2026-03-10 15:29:52.161728541 +0000 UTC m=+1456.868543289" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.162703 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6431434080000002 podStartE2EDuration="6.162697229s" podCreationTimestamp="2026-03-10 15:29:46 +0000 UTC" firstStartedPulling="2026-03-10 15:29:47.392316216 +0000 UTC m=+1452.099130964" lastFinishedPulling="2026-03-10 15:29:50.911870037 +0000 UTC m=+1455.618684785" observedRunningTime="2026-03-10 15:29:52.138334902 +0000 UTC m=+1456.845149660" watchObservedRunningTime="2026-03-10 15:29:52.162697229 +0000 UTC m=+1456.869511967" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.184320 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.955922058 podStartE2EDuration="6.184293087s" podCreationTimestamp="2026-03-10 15:29:46 +0000 UTC" firstStartedPulling="2026-03-10 15:29:47.682451908 +0000 UTC m=+1452.389266656" lastFinishedPulling="2026-03-10 15:29:50.910822937 +0000 UTC m=+1455.617637685" observedRunningTime="2026-03-10 15:29:52.178617134 +0000 UTC m=+1456.885431882" watchObservedRunningTime="2026-03-10 15:29:52.184293087 +0000 UTC m=+1456.891107835" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.209892 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.85326261 
podStartE2EDuration="6.209839438s" podCreationTimestamp="2026-03-10 15:29:46 +0000 UTC" firstStartedPulling="2026-03-10 15:29:47.553059905 +0000 UTC m=+1452.259874653" lastFinishedPulling="2026-03-10 15:29:50.909636743 +0000 UTC m=+1455.616451481" observedRunningTime="2026-03-10 15:29:52.199757589 +0000 UTC m=+1456.906572337" watchObservedRunningTime="2026-03-10 15:29:52.209839438 +0000 UTC m=+1456.916654196" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.234555 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.1851448270000002 podStartE2EDuration="6.234527334s" podCreationTimestamp="2026-03-10 15:29:46 +0000 UTC" firstStartedPulling="2026-03-10 15:29:47.869662255 +0000 UTC m=+1452.576477003" lastFinishedPulling="2026-03-10 15:29:50.919044762 +0000 UTC m=+1455.625859510" observedRunningTime="2026-03-10 15:29:52.218239848 +0000 UTC m=+1456.925054596" watchObservedRunningTime="2026-03-10 15:29:52.234527334 +0000 UTC m=+1456.941342082" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.735799 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.769563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-combined-ca-bundle\") pod \"3ed08f17-0875-42b7-8764-d5589cf03e0b\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.769681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-config-data\") pod \"3ed08f17-0875-42b7-8764-d5589cf03e0b\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.769762 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh59j\" (UniqueName: \"kubernetes.io/projected/3ed08f17-0875-42b7-8764-d5589cf03e0b-kube-api-access-nh59j\") pod \"3ed08f17-0875-42b7-8764-d5589cf03e0b\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.769798 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ed08f17-0875-42b7-8764-d5589cf03e0b-logs\") pod \"3ed08f17-0875-42b7-8764-d5589cf03e0b\" (UID: \"3ed08f17-0875-42b7-8764-d5589cf03e0b\") " Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.770308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed08f17-0875-42b7-8764-d5589cf03e0b-logs" (OuterVolumeSpecName: "logs") pod "3ed08f17-0875-42b7-8764-d5589cf03e0b" (UID: "3ed08f17-0875-42b7-8764-d5589cf03e0b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.776036 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed08f17-0875-42b7-8764-d5589cf03e0b-kube-api-access-nh59j" (OuterVolumeSpecName: "kube-api-access-nh59j") pod "3ed08f17-0875-42b7-8764-d5589cf03e0b" (UID: "3ed08f17-0875-42b7-8764-d5589cf03e0b"). InnerVolumeSpecName "kube-api-access-nh59j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.807216 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-config-data" (OuterVolumeSpecName: "config-data") pod "3ed08f17-0875-42b7-8764-d5589cf03e0b" (UID: "3ed08f17-0875-42b7-8764-d5589cf03e0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.809935 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ed08f17-0875-42b7-8764-d5589cf03e0b" (UID: "3ed08f17-0875-42b7-8764-d5589cf03e0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.871872 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.871931 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh59j\" (UniqueName: \"kubernetes.io/projected/3ed08f17-0875-42b7-8764-d5589cf03e0b-kube-api-access-nh59j\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.871952 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ed08f17-0875-42b7-8764-d5589cf03e0b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:52 crc kubenswrapper[4743]: I0310 15:29:52.871963 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed08f17-0875-42b7-8764-d5589cf03e0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.137252 4743 generic.go:334] "Generic (PLEG): container finished" podID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerID="3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36" exitCode=0 Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.137291 4743 generic.go:334] "Generic (PLEG): container finished" podID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerID="338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8" exitCode=143 Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.137378 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.137440 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ed08f17-0875-42b7-8764-d5589cf03e0b","Type":"ContainerDied","Data":"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36"} Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.137486 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ed08f17-0875-42b7-8764-d5589cf03e0b","Type":"ContainerDied","Data":"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8"} Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.137502 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ed08f17-0875-42b7-8764-d5589cf03e0b","Type":"ContainerDied","Data":"e8ac037bb98ae007441d888ed9c63c6a731f5fad7b044e6d3f3999ad04095526"} Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.137525 4743 scope.go:117] "RemoveContainer" containerID="3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.165090 4743 scope.go:117] "RemoveContainer" containerID="338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.192500 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.215692 4743 scope.go:117] "RemoveContainer" containerID="3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36" Mar 10 15:29:53 crc kubenswrapper[4743]: E0310 15:29:53.216241 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36\": container with ID starting with 3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36 
not found: ID does not exist" containerID="3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.216271 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36"} err="failed to get container status \"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36\": rpc error: code = NotFound desc = could not find container \"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36\": container with ID starting with 3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36 not found: ID does not exist" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.216294 4743 scope.go:117] "RemoveContainer" containerID="338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8" Mar 10 15:29:53 crc kubenswrapper[4743]: E0310 15:29:53.216529 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8\": container with ID starting with 338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8 not found: ID does not exist" containerID="338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.216568 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8"} err="failed to get container status \"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8\": rpc error: code = NotFound desc = could not find container \"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8\": container with ID starting with 338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8 not found: ID does not exist" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 
15:29:53.216582 4743 scope.go:117] "RemoveContainer" containerID="3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.217424 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.217640 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36"} err="failed to get container status \"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36\": rpc error: code = NotFound desc = could not find container \"3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36\": container with ID starting with 3ce74f9ef413a89d5ff174f524458f05b6cf221fa9ade6b8b72fd77f0e0f6d36 not found: ID does not exist" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.217668 4743 scope.go:117] "RemoveContainer" containerID="338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.217934 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8"} err="failed to get container status \"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8\": rpc error: code = NotFound desc = could not find container \"338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8\": container with ID starting with 338c9d492c38a9ad651f829ae3382cb0c242849471f1968462c63207181895a8 not found: ID does not exist" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.229036 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:53 crc kubenswrapper[4743]: E0310 15:29:53.229695 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" 
containerName="nova-metadata-log" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.229720 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerName="nova-metadata-log" Mar 10 15:29:53 crc kubenswrapper[4743]: E0310 15:29:53.229756 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerName="nova-metadata-metadata" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.229763 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerName="nova-metadata-metadata" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.229989 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerName="nova-metadata-metadata" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.230036 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" containerName="nova-metadata-log" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.231242 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.239282 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.239480 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.250774 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.280875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z54l\" (UniqueName: \"kubernetes.io/projected/ecec10a1-dfaf-4c61-a186-8a8dace31806-kube-api-access-2z54l\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.281033 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.281085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecec10a1-dfaf-4c61-a186-8a8dace31806-logs\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.281171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.281223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-config-data\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.382300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-config-data\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.382733 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z54l\" (UniqueName: \"kubernetes.io/projected/ecec10a1-dfaf-4c61-a186-8a8dace31806-kube-api-access-2z54l\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.382860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.382897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecec10a1-dfaf-4c61-a186-8a8dace31806-logs\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.382927 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.383380 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecec10a1-dfaf-4c61-a186-8a8dace31806-logs\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.387337 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.395332 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.406146 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.406410 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="26e1af4e-d1d8-4e2f-b5ca-917dfca9906c" containerName="kube-state-metrics" containerID="cri-o://c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd" gracePeriod=30 Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.406470 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-config-data\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.417791 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z54l\" (UniqueName: \"kubernetes.io/projected/ecec10a1-dfaf-4c61-a186-8a8dace31806-kube-api-access-2z54l\") pod \"nova-metadata-0\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.557582 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.977746 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed08f17-0875-42b7-8764-d5589cf03e0b" path="/var/lib/kubelet/pods/3ed08f17-0875-42b7-8764-d5589cf03e0b/volumes" Mar 10 15:29:53 crc kubenswrapper[4743]: I0310 15:29:53.997378 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.109895 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pghbz\" (UniqueName: \"kubernetes.io/projected/26e1af4e-d1d8-4e2f-b5ca-917dfca9906c-kube-api-access-pghbz\") pod \"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c\" (UID: \"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c\") " Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.155144 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e1af4e-d1d8-4e2f-b5ca-917dfca9906c-kube-api-access-pghbz" (OuterVolumeSpecName: "kube-api-access-pghbz") pod "26e1af4e-d1d8-4e2f-b5ca-917dfca9906c" (UID: "26e1af4e-d1d8-4e2f-b5ca-917dfca9906c"). InnerVolumeSpecName "kube-api-access-pghbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.194290 4743 generic.go:334] "Generic (PLEG): container finished" podID="26e1af4e-d1d8-4e2f-b5ca-917dfca9906c" containerID="c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd" exitCode=2 Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.194352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c","Type":"ContainerDied","Data":"c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd"} Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.194382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"26e1af4e-d1d8-4e2f-b5ca-917dfca9906c","Type":"ContainerDied","Data":"c9578c10055819a51ad2192fbdb2aa312c1e09a37fa8084681cb02dd770ae8a5"} Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.194399 4743 scope.go:117] "RemoveContainer" containerID="c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.194526 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.214024 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pghbz\" (UniqueName: \"kubernetes.io/projected/26e1af4e-d1d8-4e2f-b5ca-917dfca9906c-kube-api-access-pghbz\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.234549 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.260454 4743 scope.go:117] "RemoveContainer" containerID="c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd" Mar 10 15:29:54 crc kubenswrapper[4743]: E0310 15:29:54.262041 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd\": container with ID starting with c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd not found: ID does not exist" containerID="c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.262092 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd"} err="failed to get container status \"c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd\": rpc error: code = NotFound desc = could not find container \"c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd\": container with ID starting with c22ecac82250681ad92220d0bc13e72e9dd0aca09a8159bbfd3957b861a275cd not found: ID does not exist" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.263695 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.289647 4743 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.322818 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:29:54 crc kubenswrapper[4743]: E0310 15:29:54.323412 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e1af4e-d1d8-4e2f-b5ca-917dfca9906c" containerName="kube-state-metrics" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.323429 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e1af4e-d1d8-4e2f-b5ca-917dfca9906c" containerName="kube-state-metrics" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.323647 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e1af4e-d1d8-4e2f-b5ca-917dfca9906c" containerName="kube-state-metrics" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.325881 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.329253 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.329288 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.374314 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.423488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.423821 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njgrh\" (UniqueName: \"kubernetes.io/projected/820c4321-0bbe-4413-bf70-80da56a68366-kube-api-access-njgrh\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.423875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.423971 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.526171 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.526229 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njgrh\" (UniqueName: \"kubernetes.io/projected/820c4321-0bbe-4413-bf70-80da56a68366-kube-api-access-njgrh\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.526251 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.526330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.531784 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.532664 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.545199 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/820c4321-0bbe-4413-bf70-80da56a68366-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.553853 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-njgrh\" (UniqueName: \"kubernetes.io/projected/820c4321-0bbe-4413-bf70-80da56a68366-kube-api-access-njgrh\") pod \"kube-state-metrics-0\" (UID: \"820c4321-0bbe-4413-bf70-80da56a68366\") " pod="openstack/kube-state-metrics-0" Mar 10 15:29:54 crc kubenswrapper[4743]: I0310 15:29:54.649667 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.205204 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.229997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"820c4321-0bbe-4413-bf70-80da56a68366","Type":"ContainerStarted","Data":"6220ee549a5f1eb5597fc505fcaea866dbaa83eb898064c035ff9fc5835871bc"} Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.233410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecec10a1-dfaf-4c61-a186-8a8dace31806","Type":"ContainerStarted","Data":"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee"} Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.233475 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecec10a1-dfaf-4c61-a186-8a8dace31806","Type":"ContainerStarted","Data":"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295"} Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.233489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecec10a1-dfaf-4c61-a186-8a8dace31806","Type":"ContainerStarted","Data":"b961d40ee6a04a8a41b67ef10eec4b860c7faa96b002d8e3d8e74bc73dc44242"} Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.272527 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.272503655 podStartE2EDuration="2.272503655s" podCreationTimestamp="2026-03-10 15:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:55.257528247 +0000 UTC m=+1459.964342995" watchObservedRunningTime="2026-03-10 15:29:55.272503655 +0000 UTC m=+1459.979318403" Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.946090 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e1af4e-d1d8-4e2f-b5ca-917dfca9906c" path="/var/lib/kubelet/pods/26e1af4e-d1d8-4e2f-b5ca-917dfca9906c/volumes" Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.968544 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.970105 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="ceilometer-central-agent" containerID="cri-o://4a15918033e02909941377b979e69147db358742340567004d2dec3ebbf02d5a" gracePeriod=30 Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.970204 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="sg-core" containerID="cri-o://1aca20ffe56168504c3fb352e4f749f0347b8838515c117b285546d7bd6dcb1e" gracePeriod=30 Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.970234 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="proxy-httpd" containerID="cri-o://3cc967ecf33e219e2e63415a97df2f76041124deffcd702d924189341f92de63" gracePeriod=30 Mar 10 15:29:55 crc kubenswrapper[4743]: I0310 15:29:55.970249 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="ceilometer-notification-agent" containerID="cri-o://a1961592dd52d3e0efe5ae8ab07de8d9689ab192d3e5fdc7c0e020a1c8a10958" gracePeriod=30 Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.246273 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerID="1aca20ffe56168504c3fb352e4f749f0347b8838515c117b285546d7bd6dcb1e" exitCode=2 Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.246358 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerDied","Data":"1aca20ffe56168504c3fb352e4f749f0347b8838515c117b285546d7bd6dcb1e"} Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.248008 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b077b1c-05fd-4ef3-8adb-42ff870116c9" containerID="ac1e4cfb488b460578546001cf462b5ef2cf0dfd231f446088c2b77d097a0340" exitCode=0 Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.248079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4rs44" event={"ID":"8b077b1c-05fd-4ef3-8adb-42ff870116c9","Type":"ContainerDied","Data":"ac1e4cfb488b460578546001cf462b5ef2cf0dfd231f446088c2b77d097a0340"} Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.249396 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"820c4321-0bbe-4413-bf70-80da56a68366","Type":"ContainerStarted","Data":"86c4a6e20841bb708f903a0a4a5b9abd1f3afc24b8e0523cae6e60e507f480dc"} Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.249525 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.284635 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" 
podStartSLOduration=1.8793741910000001 podStartE2EDuration="2.284619777s" podCreationTimestamp="2026-03-10 15:29:54 +0000 UTC" firstStartedPulling="2026-03-10 15:29:55.205742675 +0000 UTC m=+1459.912557423" lastFinishedPulling="2026-03-10 15:29:55.610988261 +0000 UTC m=+1460.317803009" observedRunningTime="2026-03-10 15:29:56.284092742 +0000 UTC m=+1460.990907490" watchObservedRunningTime="2026-03-10 15:29:56.284619777 +0000 UTC m=+1460.991434525" Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.506110 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.506183 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.532305 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 15:29:56 crc kubenswrapper[4743]: I0310 15:29:56.861434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.164130 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.165882 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.255135 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.264426 4743 generic.go:334] "Generic (PLEG): container finished" podID="67f18369-c26d-4389-a49c-e0178c8d6db3" containerID="3437e368284b6c57ea8906b6b04141fd77c8e95cea732ef59b968b2ee59909ac" exitCode=0 Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.264507 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l2xjq" event={"ID":"67f18369-c26d-4389-a49c-e0178c8d6db3","Type":"ContainerDied","Data":"3437e368284b6c57ea8906b6b04141fd77c8e95cea732ef59b968b2ee59909ac"} Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.274145 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerID="3cc967ecf33e219e2e63415a97df2f76041124deffcd702d924189341f92de63" exitCode=0 Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.274187 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerID="4a15918033e02909941377b979e69147db358742340567004d2dec3ebbf02d5a" exitCode=0 Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.274319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerDied","Data":"3cc967ecf33e219e2e63415a97df2f76041124deffcd702d924189341f92de63"} Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.274396 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerDied","Data":"4a15918033e02909941377b979e69147db358742340567004d2dec3ebbf02d5a"} Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.360010 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-hgqrq"] Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.363527 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" podUID="593624b1-1f23-4fdb-8b94-00837da810bc" containerName="dnsmasq-dns" containerID="cri-o://2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7" gracePeriod=10 Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.364170 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Mar 10 15:29:57 crc kubenswrapper[4743]: E0310 15:29:57.636886 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod593624b1_1f23_4fdb_8b94_00837da810bc.slice/crio-conmon-2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.834721 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.924671 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-config-data\") pod \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.924754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dm5w\" (UniqueName: \"kubernetes.io/projected/8b077b1c-05fd-4ef3-8adb-42ff870116c9-kube-api-access-9dm5w\") pod \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.924966 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-scripts\") pod \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.924991 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-combined-ca-bundle\") pod 
\"8b077b1c-05fd-4ef3-8adb-42ff870116c9\" (UID: \"8b077b1c-05fd-4ef3-8adb-42ff870116c9\") " Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.938682 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b077b1c-05fd-4ef3-8adb-42ff870116c9-kube-api-access-9dm5w" (OuterVolumeSpecName: "kube-api-access-9dm5w") pod "8b077b1c-05fd-4ef3-8adb-42ff870116c9" (UID: "8b077b1c-05fd-4ef3-8adb-42ff870116c9"). InnerVolumeSpecName "kube-api-access-9dm5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.943971 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-scripts" (OuterVolumeSpecName: "scripts") pod "8b077b1c-05fd-4ef3-8adb-42ff870116c9" (UID: "8b077b1c-05fd-4ef3-8adb-42ff870116c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.960696 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b077b1c-05fd-4ef3-8adb-42ff870116c9" (UID: "8b077b1c-05fd-4ef3-8adb-42ff870116c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:57 crc kubenswrapper[4743]: I0310 15:29:57.969456 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-config-data" (OuterVolumeSpecName: "config-data") pod "8b077b1c-05fd-4ef3-8adb-42ff870116c9" (UID: "8b077b1c-05fd-4ef3-8adb-42ff870116c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.013700 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.027242 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.028051 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dm5w\" (UniqueName: \"kubernetes.io/projected/8b077b1c-05fd-4ef3-8adb-42ff870116c9-kube-api-access-9dm5w\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.028145 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.028197 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b077b1c-05fd-4ef3-8adb-42ff870116c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.129701 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbt6r\" (UniqueName: \"kubernetes.io/projected/593624b1-1f23-4fdb-8b94-00837da810bc-kube-api-access-fbt6r\") pod \"593624b1-1f23-4fdb-8b94-00837da810bc\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.130003 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-config\") pod \"593624b1-1f23-4fdb-8b94-00837da810bc\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.130135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-nb\") pod \"593624b1-1f23-4fdb-8b94-00837da810bc\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.130285 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-svc\") pod \"593624b1-1f23-4fdb-8b94-00837da810bc\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.130423 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-sb\") pod \"593624b1-1f23-4fdb-8b94-00837da810bc\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.130570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-swift-storage-0\") pod \"593624b1-1f23-4fdb-8b94-00837da810bc\" (UID: \"593624b1-1f23-4fdb-8b94-00837da810bc\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.134013 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593624b1-1f23-4fdb-8b94-00837da810bc-kube-api-access-fbt6r" (OuterVolumeSpecName: "kube-api-access-fbt6r") pod "593624b1-1f23-4fdb-8b94-00837da810bc" (UID: "593624b1-1f23-4fdb-8b94-00837da810bc"). InnerVolumeSpecName "kube-api-access-fbt6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.181871 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "593624b1-1f23-4fdb-8b94-00837da810bc" (UID: "593624b1-1f23-4fdb-8b94-00837da810bc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.191973 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "593624b1-1f23-4fdb-8b94-00837da810bc" (UID: "593624b1-1f23-4fdb-8b94-00837da810bc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.193217 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "593624b1-1f23-4fdb-8b94-00837da810bc" (UID: "593624b1-1f23-4fdb-8b94-00837da810bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.196416 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "593624b1-1f23-4fdb-8b94-00837da810bc" (UID: "593624b1-1f23-4fdb-8b94-00837da810bc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.201079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-config" (OuterVolumeSpecName: "config") pod "593624b1-1f23-4fdb-8b94-00837da810bc" (UID: "593624b1-1f23-4fdb-8b94-00837da810bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.233103 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.233142 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbt6r\" (UniqueName: \"kubernetes.io/projected/593624b1-1f23-4fdb-8b94-00837da810bc-kube-api-access-fbt6r\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.233154 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.233163 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.233172 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.233179 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/593624b1-1f23-4fdb-8b94-00837da810bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.249065 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.249430 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.283469 4743 generic.go:334] "Generic (PLEG): container finished" podID="593624b1-1f23-4fdb-8b94-00837da810bc" containerID="2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7" exitCode=0 Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.283540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" event={"ID":"593624b1-1f23-4fdb-8b94-00837da810bc","Type":"ContainerDied","Data":"2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7"} Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.283572 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" event={"ID":"593624b1-1f23-4fdb-8b94-00837da810bc","Type":"ContainerDied","Data":"81f5e97c4992dcf4e8739c11b9aa9d88fd244b45144544f6bea1c21957f8c3a5"} Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.283595 4743 scope.go:117] "RemoveContainer" containerID="2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.283744 4743 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d6d889f-hgqrq" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.291330 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4rs44" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.297026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4rs44" event={"ID":"8b077b1c-05fd-4ef3-8adb-42ff870116c9","Type":"ContainerDied","Data":"6a2201398775d20dac02e8655880032b77f79647079a6d7af346ef430362a04e"} Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.297114 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a2201398775d20dac02e8655880032b77f79647079a6d7af346ef430362a04e" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.348208 4743 scope.go:117] "RemoveContainer" containerID="ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.384515 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 15:29:58 crc kubenswrapper[4743]: E0310 15:29:58.385100 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593624b1-1f23-4fdb-8b94-00837da810bc" containerName="init" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.385121 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="593624b1-1f23-4fdb-8b94-00837da810bc" containerName="init" Mar 10 15:29:58 crc kubenswrapper[4743]: E0310 15:29:58.385161 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593624b1-1f23-4fdb-8b94-00837da810bc" containerName="dnsmasq-dns" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.385184 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="593624b1-1f23-4fdb-8b94-00837da810bc" containerName="dnsmasq-dns" Mar 10 15:29:58 crc kubenswrapper[4743]: E0310 15:29:58.385202 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b077b1c-05fd-4ef3-8adb-42ff870116c9" containerName="nova-cell1-conductor-db-sync" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.385210 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b077b1c-05fd-4ef3-8adb-42ff870116c9" containerName="nova-cell1-conductor-db-sync" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.385453 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b077b1c-05fd-4ef3-8adb-42ff870116c9" containerName="nova-cell1-conductor-db-sync" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.385473 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="593624b1-1f23-4fdb-8b94-00837da810bc" containerName="dnsmasq-dns" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.386512 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.407201 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.407973 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-hgqrq"] Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.408025 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-hgqrq"] Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.425608 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.446685 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9wq2\" (UniqueName: \"kubernetes.io/projected/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-kube-api-access-k9wq2\") pod \"nova-cell1-conductor-0\" (UID: \"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " 
pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.447093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.447296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.463340 4743 scope.go:117] "RemoveContainer" containerID="2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7" Mar 10 15:29:58 crc kubenswrapper[4743]: E0310 15:29:58.463751 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7\": container with ID starting with 2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7 not found: ID does not exist" containerID="2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.463790 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7"} err="failed to get container status \"2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7\": rpc error: code = NotFound desc = could not find container \"2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7\": container with ID starting with 
2e28bf4b1e92a963365bd01ad891e49d315411c0c598f40a3d3ae1076942c6e7 not found: ID does not exist" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.463837 4743 scope.go:117] "RemoveContainer" containerID="ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86" Mar 10 15:29:58 crc kubenswrapper[4743]: E0310 15:29:58.464309 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86\": container with ID starting with ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86 not found: ID does not exist" containerID="ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.464503 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86"} err="failed to get container status \"ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86\": rpc error: code = NotFound desc = could not find container \"ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86\": container with ID starting with ea5762b939af7f0b3d2da4a3ed05e8716a63121d162453a9ace2845f6dcb5b86 not found: ID does not exist" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.560999 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.562191 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.563442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9wq2\" (UniqueName: \"kubernetes.io/projected/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-kube-api-access-k9wq2\") pod \"nova-cell1-conductor-0\" (UID: 
\"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.563666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.563793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.576551 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.576567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.584834 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9wq2\" (UniqueName: \"kubernetes.io/projected/05ebbfdb-6c63-43a0-bd96-7e0adcb97221-kube-api-access-k9wq2\") pod \"nova-cell1-conductor-0\" (UID: \"05ebbfdb-6c63-43a0-bd96-7e0adcb97221\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 
15:29:58.731866 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.777106 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.871103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cbx6\" (UniqueName: \"kubernetes.io/projected/67f18369-c26d-4389-a49c-e0178c8d6db3-kube-api-access-5cbx6\") pod \"67f18369-c26d-4389-a49c-e0178c8d6db3\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.872909 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-config-data\") pod \"67f18369-c26d-4389-a49c-e0178c8d6db3\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.873050 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-scripts\") pod \"67f18369-c26d-4389-a49c-e0178c8d6db3\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.873132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-combined-ca-bundle\") pod \"67f18369-c26d-4389-a49c-e0178c8d6db3\" (UID: \"67f18369-c26d-4389-a49c-e0178c8d6db3\") " Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.876537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f18369-c26d-4389-a49c-e0178c8d6db3-kube-api-access-5cbx6" (OuterVolumeSpecName: 
"kube-api-access-5cbx6") pod "67f18369-c26d-4389-a49c-e0178c8d6db3" (UID: "67f18369-c26d-4389-a49c-e0178c8d6db3"). InnerVolumeSpecName "kube-api-access-5cbx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.881008 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cbx6\" (UniqueName: \"kubernetes.io/projected/67f18369-c26d-4389-a49c-e0178c8d6db3-kube-api-access-5cbx6\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.881221 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-scripts" (OuterVolumeSpecName: "scripts") pod "67f18369-c26d-4389-a49c-e0178c8d6db3" (UID: "67f18369-c26d-4389-a49c-e0178c8d6db3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.923968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67f18369-c26d-4389-a49c-e0178c8d6db3" (UID: "67f18369-c26d-4389-a49c-e0178c8d6db3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.929279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-config-data" (OuterVolumeSpecName: "config-data") pod "67f18369-c26d-4389-a49c-e0178c8d6db3" (UID: "67f18369-c26d-4389-a49c-e0178c8d6db3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.991957 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.992716 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4743]: I0310 15:29:58.992751 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f18369-c26d-4389-a49c-e0178c8d6db3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.189639 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 15:29:59 crc kubenswrapper[4743]: W0310 15:29:59.197348 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05ebbfdb_6c63_43a0_bd96_7e0adcb97221.slice/crio-b90fbd72aca96230f10d7b8829b153f99988d95e05eea172c7d6c0aaeb7b1823 WatchSource:0}: Error finding container b90fbd72aca96230f10d7b8829b153f99988d95e05eea172c7d6c0aaeb7b1823: Status 404 returned error can't find the container with id b90fbd72aca96230f10d7b8829b153f99988d95e05eea172c7d6c0aaeb7b1823 Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.306690 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05ebbfdb-6c63-43a0-bd96-7e0adcb97221","Type":"ContainerStarted","Data":"b90fbd72aca96230f10d7b8829b153f99988d95e05eea172c7d6c0aaeb7b1823"} Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.308308 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-l2xjq" Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.308394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-l2xjq" event={"ID":"67f18369-c26d-4389-a49c-e0178c8d6db3","Type":"ContainerDied","Data":"f28d9e5ad44843928f4c9de5b2271b02aef8a1e7786c43f3f241f482079f7bd8"} Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.308446 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f28d9e5ad44843928f4c9de5b2271b02aef8a1e7786c43f3f241f482079f7bd8" Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.480374 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.480702 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-log" containerID="cri-o://1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15" gracePeriod=30 Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.481305 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-api" containerID="cri-o://b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1" gracePeriod=30 Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.490449 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.490663 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0cfcb383-10fc-4149-af2f-fe50f3003f05" containerName="nova-scheduler-scheduler" containerID="cri-o://c477deaa1241d980859c751caba732c97a4904fb2a215eb37cfa9907fc5e0d1f" gracePeriod=30 Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 
15:29:59.507456 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:59 crc kubenswrapper[4743]: I0310 15:29:59.926349 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593624b1-1f23-4fdb-8b94-00837da810bc" path="/var/lib/kubelet/pods/593624b1-1f23-4fdb-8b94-00837da810bc/volumes" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.154429 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v"] Mar 10 15:30:00 crc kubenswrapper[4743]: E0310 15:30:00.155195 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f18369-c26d-4389-a49c-e0178c8d6db3" containerName="nova-manage" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.155211 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f18369-c26d-4389-a49c-e0178c8d6db3" containerName="nova-manage" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.155394 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f18369-c26d-4389-a49c-e0178c8d6db3" containerName="nova-manage" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.156061 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.158498 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.159016 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.168465 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552610-k7w8j"] Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.172035 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552610-k7w8j" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.175057 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.175285 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.175410 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.181964 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v"] Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.193957 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552610-k7w8j"] Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.222365 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/677b6149-3bf9-45ee-938e-783742deb6dd-secret-volume\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.222453 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxgj2\" (UniqueName: \"kubernetes.io/projected/677b6149-3bf9-45ee-938e-783742deb6dd-kube-api-access-nxgj2\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.222669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/677b6149-3bf9-45ee-938e-783742deb6dd-config-volume\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.320523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05ebbfdb-6c63-43a0-bd96-7e0adcb97221","Type":"ContainerStarted","Data":"d156aa7c860f0ed20000b477e835e31870cb708d93722152a5b1bfad5d19595f"} Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.321045 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.325699 4743 generic.go:334] "Generic (PLEG): container finished" podID="6361004a-5259-434e-a899-81704ebee56c" containerID="1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15" exitCode=143 Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.325768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"6361004a-5259-434e-a899-81704ebee56c","Type":"ContainerDied","Data":"1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15"} Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.326096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrztd\" (UniqueName: \"kubernetes.io/projected/7c8a28f7-44f1-4871-bff9-1d64242a7f5e-kube-api-access-jrztd\") pod \"auto-csr-approver-29552610-k7w8j\" (UID: \"7c8a28f7-44f1-4871-bff9-1d64242a7f5e\") " pod="openshift-infra/auto-csr-approver-29552610-k7w8j" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.326227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/677b6149-3bf9-45ee-938e-783742deb6dd-secret-volume\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.326292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxgj2\" (UniqueName: \"kubernetes.io/projected/677b6149-3bf9-45ee-938e-783742deb6dd-kube-api-access-nxgj2\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.326377 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/677b6149-3bf9-45ee-938e-783742deb6dd-config-volume\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.327701 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/677b6149-3bf9-45ee-938e-783742deb6dd-config-volume\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.333282 4743 generic.go:334] "Generic (PLEG): container finished" podID="0cfcb383-10fc-4149-af2f-fe50f3003f05" containerID="c477deaa1241d980859c751caba732c97a4904fb2a215eb37cfa9907fc5e0d1f" exitCode=0 Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.333617 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerName="nova-metadata-log" containerID="cri-o://ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295" gracePeriod=30 Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.333687 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cfcb383-10fc-4149-af2f-fe50f3003f05","Type":"ContainerDied","Data":"c477deaa1241d980859c751caba732c97a4904fb2a215eb37cfa9907fc5e0d1f"} Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.333803 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerName="nova-metadata-metadata" containerID="cri-o://a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee" gracePeriod=30 Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.344319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/677b6149-3bf9-45ee-938e-783742deb6dd-secret-volume\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc 
kubenswrapper[4743]: I0310 15:30:00.346638 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxgj2\" (UniqueName: \"kubernetes.io/projected/677b6149-3bf9-45ee-938e-783742deb6dd-kube-api-access-nxgj2\") pod \"collect-profiles-29552610-7885v\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.352399 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.344309174 podStartE2EDuration="2.344309174s" podCreationTimestamp="2026-03-10 15:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:00.33858078 +0000 UTC m=+1465.045395518" watchObservedRunningTime="2026-03-10 15:30:00.344309174 +0000 UTC m=+1465.051123922" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.429308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrztd\" (UniqueName: \"kubernetes.io/projected/7c8a28f7-44f1-4871-bff9-1d64242a7f5e-kube-api-access-jrztd\") pod \"auto-csr-approver-29552610-k7w8j\" (UID: \"7c8a28f7-44f1-4871-bff9-1d64242a7f5e\") " pod="openshift-infra/auto-csr-approver-29552610-k7w8j" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.451495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrztd\" (UniqueName: \"kubernetes.io/projected/7c8a28f7-44f1-4871-bff9-1d64242a7f5e-kube-api-access-jrztd\") pod \"auto-csr-approver-29552610-k7w8j\" (UID: \"7c8a28f7-44f1-4871-bff9-1d64242a7f5e\") " pod="openshift-infra/auto-csr-approver-29552610-k7w8j" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.458196 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.478448 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.511469 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552610-k7w8j" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.531242 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-combined-ca-bundle\") pod \"0cfcb383-10fc-4149-af2f-fe50f3003f05\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.531460 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7zw7\" (UniqueName: \"kubernetes.io/projected/0cfcb383-10fc-4149-af2f-fe50f3003f05-kube-api-access-j7zw7\") pod \"0cfcb383-10fc-4149-af2f-fe50f3003f05\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.531561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-config-data\") pod \"0cfcb383-10fc-4149-af2f-fe50f3003f05\" (UID: \"0cfcb383-10fc-4149-af2f-fe50f3003f05\") " Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.539604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cfcb383-10fc-4149-af2f-fe50f3003f05-kube-api-access-j7zw7" (OuterVolumeSpecName: "kube-api-access-j7zw7") pod "0cfcb383-10fc-4149-af2f-fe50f3003f05" (UID: "0cfcb383-10fc-4149-af2f-fe50f3003f05"). InnerVolumeSpecName "kube-api-access-j7zw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.588578 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-config-data" (OuterVolumeSpecName: "config-data") pod "0cfcb383-10fc-4149-af2f-fe50f3003f05" (UID: "0cfcb383-10fc-4149-af2f-fe50f3003f05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.609702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cfcb383-10fc-4149-af2f-fe50f3003f05" (UID: "0cfcb383-10fc-4149-af2f-fe50f3003f05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.648680 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.648714 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7zw7\" (UniqueName: \"kubernetes.io/projected/0cfcb383-10fc-4149-af2f-fe50f3003f05-kube-api-access-j7zw7\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4743]: I0310 15:30:00.648730 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfcb383-10fc-4149-af2f-fe50f3003f05-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.043554 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v"] Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 
15:30:01.220586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552610-k7w8j"] Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.224262 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.358462 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerID="a1961592dd52d3e0efe5ae8ab07de8d9689ab192d3e5fdc7c0e020a1c8a10958" exitCode=0 Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.358588 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerDied","Data":"a1961592dd52d3e0efe5ae8ab07de8d9689ab192d3e5fdc7c0e020a1c8a10958"} Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.361117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552610-k7w8j" event={"ID":"7c8a28f7-44f1-4871-bff9-1d64242a7f5e","Type":"ContainerStarted","Data":"64389d53dac5c949887d216d46d8ffa6983f26a858feb4b53ddaf58e1c76e958"} Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.362812 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" event={"ID":"677b6149-3bf9-45ee-938e-783742deb6dd","Type":"ContainerStarted","Data":"9188259f4dcb6189e963b475c9e9b42e1ee5208a091ed135bbfbe4b78460f2ac"} Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.362881 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" event={"ID":"677b6149-3bf9-45ee-938e-783742deb6dd","Type":"ContainerStarted","Data":"952761f32fa4a06da9656613332006f39e8ef403df12365757714bf40dad1462"} Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.366631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"0cfcb383-10fc-4149-af2f-fe50f3003f05","Type":"ContainerDied","Data":"daf51e774f504146b02ee543f5741cd5af1970cfcd8bf5c6035c5a971afba747"} Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.366697 4743 scope.go:117] "RemoveContainer" containerID="c477deaa1241d980859c751caba732c97a4904fb2a215eb37cfa9907fc5e0d1f" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.366889 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.371404 4743 generic.go:334] "Generic (PLEG): container finished" podID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerID="a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee" exitCode=0 Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.371434 4743 generic.go:334] "Generic (PLEG): container finished" podID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerID="ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295" exitCode=143 Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.371509 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.371565 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecec10a1-dfaf-4c61-a186-8a8dace31806","Type":"ContainerDied","Data":"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee"} Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.371608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecec10a1-dfaf-4c61-a186-8a8dace31806","Type":"ContainerDied","Data":"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295"} Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.371628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecec10a1-dfaf-4c61-a186-8a8dace31806","Type":"ContainerDied","Data":"b961d40ee6a04a8a41b67ef10eec4b860c7faa96b002d8e3d8e74bc73dc44242"} Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.372499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecec10a1-dfaf-4c61-a186-8a8dace31806-logs\") pod \"ecec10a1-dfaf-4c61-a186-8a8dace31806\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.372631 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-combined-ca-bundle\") pod \"ecec10a1-dfaf-4c61-a186-8a8dace31806\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.372800 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z54l\" (UniqueName: \"kubernetes.io/projected/ecec10a1-dfaf-4c61-a186-8a8dace31806-kube-api-access-2z54l\") pod \"ecec10a1-dfaf-4c61-a186-8a8dace31806\" (UID: 
\"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.372861 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-config-data\") pod \"ecec10a1-dfaf-4c61-a186-8a8dace31806\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.372916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-nova-metadata-tls-certs\") pod \"ecec10a1-dfaf-4c61-a186-8a8dace31806\" (UID: \"ecec10a1-dfaf-4c61-a186-8a8dace31806\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.378153 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecec10a1-dfaf-4c61-a186-8a8dace31806-logs" (OuterVolumeSpecName: "logs") pod "ecec10a1-dfaf-4c61-a186-8a8dace31806" (UID: "ecec10a1-dfaf-4c61-a186-8a8dace31806"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.380229 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecec10a1-dfaf-4c61-a186-8a8dace31806-kube-api-access-2z54l" (OuterVolumeSpecName: "kube-api-access-2z54l") pod "ecec10a1-dfaf-4c61-a186-8a8dace31806" (UID: "ecec10a1-dfaf-4c61-a186-8a8dace31806"). InnerVolumeSpecName "kube-api-access-2z54l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.401123 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" podStartSLOduration=1.401097664 podStartE2EDuration="1.401097664s" podCreationTimestamp="2026-03-10 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:01.383798899 +0000 UTC m=+1466.090613667" watchObservedRunningTime="2026-03-10 15:30:01.401097664 +0000 UTC m=+1466.107912432" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.412011 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-config-data" (OuterVolumeSpecName: "config-data") pod "ecec10a1-dfaf-4c61-a186-8a8dace31806" (UID: "ecec10a1-dfaf-4c61-a186-8a8dace31806"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.416930 4743 scope.go:117] "RemoveContainer" containerID="a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.459907 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.475538 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.475581 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecec10a1-dfaf-4c61-a186-8a8dace31806-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.475592 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z54l\" (UniqueName: \"kubernetes.io/projected/ecec10a1-dfaf-4c61-a186-8a8dace31806-kube-api-access-2z54l\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.488855 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.505743 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 15:30:01.506308 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerName="nova-metadata-log" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.506335 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerName="nova-metadata-log" Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 15:30:01.506365 4743 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0cfcb383-10fc-4149-af2f-fe50f3003f05" containerName="nova-scheduler-scheduler" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.506374 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfcb383-10fc-4149-af2f-fe50f3003f05" containerName="nova-scheduler-scheduler" Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 15:30:01.506411 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerName="nova-metadata-metadata" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.506419 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerName="nova-metadata-metadata" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.506643 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cfcb383-10fc-4149-af2f-fe50f3003f05" containerName="nova-scheduler-scheduler" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.506685 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerName="nova-metadata-log" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.506699 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" containerName="nova-metadata-metadata" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.507554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.517968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecec10a1-dfaf-4c61-a186-8a8dace31806" (UID: "ecec10a1-dfaf-4c61-a186-8a8dace31806"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.524567 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.525027 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ecec10a1-dfaf-4c61-a186-8a8dace31806" (UID: "ecec10a1-dfaf-4c61-a186-8a8dace31806"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.531676 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.531865 4743 scope.go:117] "RemoveContainer" containerID="ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.557031 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.584692 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5glkv\" (UniqueName: \"kubernetes.io/projected/46f01fb8-754a-4367-9a23-e85cdd0e44d5-kube-api-access-5glkv\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.584860 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-config-data\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.584892 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.584985 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.584997 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecec10a1-dfaf-4c61-a186-8a8dace31806-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.641377 4743 scope.go:117] "RemoveContainer" containerID="a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee" Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 
15:30:01.649486 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee\": container with ID starting with a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee not found: ID does not exist" containerID="a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.649536 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee"} err="failed to get container status \"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee\": rpc error: code = NotFound desc = could not find container \"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee\": container with ID starting with a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee not found: ID does not exist" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.649570 4743 scope.go:117] "RemoveContainer" containerID="ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295" Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 15:30:01.659985 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295\": container with ID starting with ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295 not found: ID does not exist" containerID="ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.660035 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295"} err="failed to get container status \"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295\": rpc 
error: code = NotFound desc = could not find container \"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295\": container with ID starting with ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295 not found: ID does not exist" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.660066 4743 scope.go:117] "RemoveContainer" containerID="a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.681269 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee"} err="failed to get container status \"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee\": rpc error: code = NotFound desc = could not find container \"a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee\": container with ID starting with a08651909f164c0148912ac7aa9f46f8febecac70a7e0e8ae999fb04ca3aabee not found: ID does not exist" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.681322 4743 scope.go:117] "RemoveContainer" containerID="ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.683114 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295"} err="failed to get container status \"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295\": rpc error: code = NotFound desc = could not find container \"ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295\": container with ID starting with ab7a68b768c13d096436f06db22ce1a5ca0aee67f8bfe4dce05131dd51ccf295 not found: ID does not exist" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.685769 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-config-data\") pod \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.685876 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-run-httpd\") pod \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.685914 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-log-httpd\") pod \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.686030 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m8sm\" (UniqueName: \"kubernetes.io/projected/d4ce59fe-b30d-42e1-a49f-5108fec386c9-kube-api-access-4m8sm\") pod \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.686076 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-scripts\") pod \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.686134 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-combined-ca-bundle\") pod \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.686198 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-sg-core-conf-yaml\") pod \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\" (UID: \"d4ce59fe-b30d-42e1-a49f-5108fec386c9\") " Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.686664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-config-data\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.686708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.686876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5glkv\" (UniqueName: \"kubernetes.io/projected/46f01fb8-754a-4367-9a23-e85cdd0e44d5-kube-api-access-5glkv\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.707703 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d4ce59fe-b30d-42e1-a49f-5108fec386c9" (UID: "d4ce59fe-b30d-42e1-a49f-5108fec386c9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.713741 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d4ce59fe-b30d-42e1-a49f-5108fec386c9" (UID: "d4ce59fe-b30d-42e1-a49f-5108fec386c9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.715740 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-scripts" (OuterVolumeSpecName: "scripts") pod "d4ce59fe-b30d-42e1-a49f-5108fec386c9" (UID: "d4ce59fe-b30d-42e1-a49f-5108fec386c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.718291 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-config-data\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.727443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.744400 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5glkv\" (UniqueName: \"kubernetes.io/projected/46f01fb8-754a-4367-9a23-e85cdd0e44d5-kube-api-access-5glkv\") pod \"nova-scheduler-0\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:01 crc 
kubenswrapper[4743]: I0310 15:30:01.762908 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ce59fe-b30d-42e1-a49f-5108fec386c9-kube-api-access-4m8sm" (OuterVolumeSpecName: "kube-api-access-4m8sm") pod "d4ce59fe-b30d-42e1-a49f-5108fec386c9" (UID: "d4ce59fe-b30d-42e1-a49f-5108fec386c9"). InnerVolumeSpecName "kube-api-access-4m8sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.788943 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.788969 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ce59fe-b30d-42e1-a49f-5108fec386c9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.788979 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m8sm\" (UniqueName: \"kubernetes.io/projected/d4ce59fe-b30d-42e1-a49f-5108fec386c9-kube-api-access-4m8sm\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.788991 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.791036 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d4ce59fe-b30d-42e1-a49f-5108fec386c9" (UID: "d4ce59fe-b30d-42e1-a49f-5108fec386c9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.821204 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4ce59fe-b30d-42e1-a49f-5108fec386c9" (UID: "d4ce59fe-b30d-42e1-a49f-5108fec386c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.846976 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-config-data" (OuterVolumeSpecName: "config-data") pod "d4ce59fe-b30d-42e1-a49f-5108fec386c9" (UID: "d4ce59fe-b30d-42e1-a49f-5108fec386c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.890617 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.890880 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.890970 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ce59fe-b30d-42e1-a49f-5108fec386c9-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.896416 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.905594 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"]
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.925453 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cfcb383-10fc-4149-af2f-fe50f3003f05" path="/var/lib/kubelet/pods/0cfcb383-10fc-4149-af2f-fe50f3003f05/volumes"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.926066 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecec10a1-dfaf-4c61-a186-8a8dace31806" path="/var/lib/kubelet/pods/ecec10a1-dfaf-4c61-a186-8a8dace31806/volumes"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.926849 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 15:30:01.927170 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="ceilometer-central-agent"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.927188 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="ceilometer-central-agent"
Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 15:30:01.927209 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="proxy-httpd"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.927215 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="proxy-httpd"
Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 15:30:01.927245 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="ceilometer-notification-agent"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.927251 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="ceilometer-notification-agent"
Mar 10 15:30:01 crc kubenswrapper[4743]: E0310 15:30:01.927266 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="sg-core"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.927274 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="sg-core"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.927444 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="ceilometer-central-agent"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.927457 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="ceilometer-notification-agent"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.927473 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="proxy-httpd"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.927486 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" containerName="sg-core"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.929174 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.929286 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.934873 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.935263 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.953402 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.993310 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.993395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wxw\" (UniqueName: \"kubernetes.io/projected/e33770af-c94d-4a80-97d0-412c0f3605d6-kube-api-access-g4wxw\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.993417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e33770af-c94d-4a80-97d0-412c0f3605d6-logs\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.993436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:01 crc kubenswrapper[4743]: I0310 15:30:01.993896 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-config-data\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.096077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-config-data\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.096372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.096430 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wxw\" (UniqueName: \"kubernetes.io/projected/e33770af-c94d-4a80-97d0-412c0f3605d6-kube-api-access-g4wxw\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.096451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e33770af-c94d-4a80-97d0-412c0f3605d6-logs\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.096473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.097423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e33770af-c94d-4a80-97d0-412c0f3605d6-logs\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.102437 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-config-data\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.103143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.103865 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.113230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wxw\" (UniqueName: \"kubernetes.io/projected/e33770af-c94d-4a80-97d0-412c0f3605d6-kube-api-access-g4wxw\") pod \"nova-metadata-0\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.263134 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.383813 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ce59fe-b30d-42e1-a49f-5108fec386c9","Type":"ContainerDied","Data":"29920f485f44ca3b2c8d042f9ce0c7807ffc11a117cda62d31cfe1ba0a9ae62c"}
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.383877 4743 scope.go:117] "RemoveContainer" containerID="3cc967ecf33e219e2e63415a97df2f76041124deffcd702d924189341f92de63"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.383920 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.397413 4743 generic.go:334] "Generic (PLEG): container finished" podID="677b6149-3bf9-45ee-938e-783742deb6dd" containerID="9188259f4dcb6189e963b475c9e9b42e1ee5208a091ed135bbfbe4b78460f2ac" exitCode=0
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.397643 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" event={"ID":"677b6149-3bf9-45ee-938e-783742deb6dd","Type":"ContainerDied","Data":"9188259f4dcb6189e963b475c9e9b42e1ee5208a091ed135bbfbe4b78460f2ac"}
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.413764 4743 scope.go:117] "RemoveContainer" containerID="1aca20ffe56168504c3fb352e4f749f0347b8838515c117b285546d7bd6dcb1e"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.423928 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.453128 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.468538 4743 scope.go:117] "RemoveContainer" containerID="a1961592dd52d3e0efe5ae8ab07de8d9689ab192d3e5fdc7c0e020a1c8a10958"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.496621 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.514558 4743 scope.go:117] "RemoveContainer" containerID="4a15918033e02909941377b979e69147db358742340567004d2dec3ebbf02d5a"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.524493 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.527630 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.532565 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.532967 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.533229 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.561849 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.615397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.615519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-scripts\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.615588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29gn\" (UniqueName: \"kubernetes.io/projected/40027638-16c4-4a09-891f-7f1d3524d5f2-kube-api-access-j29gn\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.615625 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.615644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-config-data\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.615675 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.615739 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-log-httpd\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.615788 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-run-httpd\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.719669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-config-data\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.720646 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.721065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.721133 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-log-httpd\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.721225 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-run-httpd\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.721348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.721476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-scripts\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.721581 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j29gn\" (UniqueName: \"kubernetes.io/projected/40027638-16c4-4a09-891f-7f1d3524d5f2-kube-api-access-j29gn\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.721706 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-log-httpd\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.722425 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-run-httpd\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.725483 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.725484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.725650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.727850 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-scripts\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.740138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-config-data\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.750611 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29gn\" (UniqueName: \"kubernetes.io/projected/40027638-16c4-4a09-891f-7f1d3524d5f2-kube-api-access-j29gn\") pod \"ceilometer-0\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " pod="openstack/ceilometer-0"
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.768911 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 15:30:02 crc kubenswrapper[4743]: I0310 15:30:02.999278 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.417922 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e33770af-c94d-4a80-97d0-412c0f3605d6","Type":"ContainerStarted","Data":"d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a"}
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.418253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e33770af-c94d-4a80-97d0-412c0f3605d6","Type":"ContainerStarted","Data":"6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f"}
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.418269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e33770af-c94d-4a80-97d0-412c0f3605d6","Type":"ContainerStarted","Data":"bb78405fc8a436438074c37c1775cdf75a66f9ec1c311298fb037cfb83579329"}
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.422789 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46f01fb8-754a-4367-9a23-e85cdd0e44d5","Type":"ContainerStarted","Data":"f074304206d1fb638ec82bc4514026d9a9cf8ecc35e20ec48cbb6a855f822f85"}
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.422935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46f01fb8-754a-4367-9a23-e85cdd0e44d5","Type":"ContainerStarted","Data":"53aacd3a40a993386cda42690f6acb05d435291bbbc6022c46dd2d5045447eb2"}
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.440181 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.440164531 podStartE2EDuration="2.440164531s" podCreationTimestamp="2026-03-10 15:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:03.438825662 +0000 UTC m=+1468.145640410" watchObservedRunningTime="2026-03-10 15:30:03.440164531 +0000 UTC m=+1468.146979279"
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.480035 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.480013751 podStartE2EDuration="2.480013751s" podCreationTimestamp="2026-03-10 15:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:03.479731183 +0000 UTC m=+1468.186545951" watchObservedRunningTime="2026-03-10 15:30:03.480013751 +0000 UTC m=+1468.186828499"
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.523125 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:03 crc kubenswrapper[4743]: W0310 15:30:03.536747 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40027638_16c4_4a09_891f_7f1d3524d5f2.slice/crio-23110ea90357f4e980e66419145667c50dbd5a2fd954074f171a1e6bf21c7c07 WatchSource:0}: Error finding container 23110ea90357f4e980e66419145667c50dbd5a2fd954074f171a1e6bf21c7c07: Status 404 returned error can't find the container with id 23110ea90357f4e980e66419145667c50dbd5a2fd954074f171a1e6bf21c7c07
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.830425 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v"
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.933198 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ce59fe-b30d-42e1-a49f-5108fec386c9" path="/var/lib/kubelet/pods/d4ce59fe-b30d-42e1-a49f-5108fec386c9/volumes"
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.952127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/677b6149-3bf9-45ee-938e-783742deb6dd-secret-volume\") pod \"677b6149-3bf9-45ee-938e-783742deb6dd\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") "
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.952217 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/677b6149-3bf9-45ee-938e-783742deb6dd-config-volume\") pod \"677b6149-3bf9-45ee-938e-783742deb6dd\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") "
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.952325 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxgj2\" (UniqueName: \"kubernetes.io/projected/677b6149-3bf9-45ee-938e-783742deb6dd-kube-api-access-nxgj2\") pod \"677b6149-3bf9-45ee-938e-783742deb6dd\" (UID: \"677b6149-3bf9-45ee-938e-783742deb6dd\") "
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.953647 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677b6149-3bf9-45ee-938e-783742deb6dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "677b6149-3bf9-45ee-938e-783742deb6dd" (UID: "677b6149-3bf9-45ee-938e-783742deb6dd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.959916 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677b6149-3bf9-45ee-938e-783742deb6dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "677b6149-3bf9-45ee-938e-783742deb6dd" (UID: "677b6149-3bf9-45ee-938e-783742deb6dd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:30:03 crc kubenswrapper[4743]: I0310 15:30:03.960059 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677b6149-3bf9-45ee-938e-783742deb6dd-kube-api-access-nxgj2" (OuterVolumeSpecName: "kube-api-access-nxgj2") pod "677b6149-3bf9-45ee-938e-783742deb6dd" (UID: "677b6149-3bf9-45ee-938e-783742deb6dd"). InnerVolumeSpecName "kube-api-access-nxgj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.058355 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/677b6149-3bf9-45ee-938e-783742deb6dd-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.058389 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/677b6149-3bf9-45ee-938e-783742deb6dd-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.058399 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxgj2\" (UniqueName: \"kubernetes.io/projected/677b6149-3bf9-45ee-938e-783742deb6dd-kube-api-access-nxgj2\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.277699 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.366226 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6361004a-5259-434e-a899-81704ebee56c-logs\") pod \"6361004a-5259-434e-a899-81704ebee56c\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") "
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.366335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-combined-ca-bundle\") pod \"6361004a-5259-434e-a899-81704ebee56c\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") "
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.366364 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9c59\" (UniqueName: \"kubernetes.io/projected/6361004a-5259-434e-a899-81704ebee56c-kube-api-access-j9c59\") pod \"6361004a-5259-434e-a899-81704ebee56c\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") "
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.366447 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-config-data\") pod \"6361004a-5259-434e-a899-81704ebee56c\" (UID: \"6361004a-5259-434e-a899-81704ebee56c\") "
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.368645 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6361004a-5259-434e-a899-81704ebee56c-logs" (OuterVolumeSpecName: "logs") pod "6361004a-5259-434e-a899-81704ebee56c" (UID: "6361004a-5259-434e-a899-81704ebee56c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.374078 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6361004a-5259-434e-a899-81704ebee56c-kube-api-access-j9c59" (OuterVolumeSpecName: "kube-api-access-j9c59") pod "6361004a-5259-434e-a899-81704ebee56c" (UID: "6361004a-5259-434e-a899-81704ebee56c"). InnerVolumeSpecName "kube-api-access-j9c59". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.403793 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6361004a-5259-434e-a899-81704ebee56c" (UID: "6361004a-5259-434e-a899-81704ebee56c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.407598 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-config-data" (OuterVolumeSpecName: "config-data") pod "6361004a-5259-434e-a899-81704ebee56c" (UID: "6361004a-5259-434e-a899-81704ebee56c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.441946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerStarted","Data":"23110ea90357f4e980e66419145667c50dbd5a2fd954074f171a1e6bf21c7c07"}
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.445714 4743 generic.go:334] "Generic (PLEG): container finished" podID="6361004a-5259-434e-a899-81704ebee56c" containerID="b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1" exitCode=0
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.445751 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.445791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6361004a-5259-434e-a899-81704ebee56c","Type":"ContainerDied","Data":"b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1"}
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.445834 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6361004a-5259-434e-a899-81704ebee56c","Type":"ContainerDied","Data":"d7d69a532823c2c9d51f44f104785b5d553b35e3bc40a2ac4d49edf8a4e1cc89"}
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.445853 4743 scope.go:117] "RemoveContainer" containerID="b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.450274 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.450597 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v" event={"ID":"677b6149-3bf9-45ee-938e-783742deb6dd","Type":"ContainerDied","Data":"952761f32fa4a06da9656613332006f39e8ef403df12365757714bf40dad1462"}
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.450618 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952761f32fa4a06da9656613332006f39e8ef403df12365757714bf40dad1462"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.475717 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6361004a-5259-434e-a899-81704ebee56c-logs\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.475742 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.475752 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9c59\" (UniqueName: \"kubernetes.io/projected/6361004a-5259-434e-a899-81704ebee56c-kube-api-access-j9c59\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.475763 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6361004a-5259-434e-a899-81704ebee56c-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.488550 4743 scope.go:117] "RemoveContainer" containerID="1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.489448 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.499243 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.512467 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 10 15:30:04 crc kubenswrapper[4743]: E0310 15:30:04.512959 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677b6149-3bf9-45ee-938e-783742deb6dd" containerName="collect-profiles"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.512978 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="677b6149-3bf9-45ee-938e-783742deb6dd" containerName="collect-profiles"
Mar 10 15:30:04 crc kubenswrapper[4743]: E0310 15:30:04.513017 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-log"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.513024 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-log"
Mar 10 15:30:04 crc kubenswrapper[4743]: E0310 15:30:04.513038 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-api"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.513044 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-api"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.513288 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-api"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.513305 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="677b6149-3bf9-45ee-938e-783742deb6dd" containerName="collect-profiles"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.513317 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6361004a-5259-434e-a899-81704ebee56c" containerName="nova-api-log"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.514430 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.524589 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.554279 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.609026 4743 scope.go:117] "RemoveContainer" containerID="b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1"
Mar 10 15:30:04 crc kubenswrapper[4743]: E0310 15:30:04.609500 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1\": container with ID starting with b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1 not found: ID does not exist" containerID="b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.609553 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1"} err="failed to get container status \"b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1\": rpc error: code = NotFound desc = could not find container \"b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1\": container with ID starting with b170c4b795909e4ba22974943b8cc6b2b1ab810cd61f3a922199891134029ab1 not found: ID does not exist"
Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.609580 4743 scope.go:117] "RemoveContainer"
containerID="1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15" Mar 10 15:30:04 crc kubenswrapper[4743]: E0310 15:30:04.610063 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15\": container with ID starting with 1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15 not found: ID does not exist" containerID="1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.610110 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15"} err="failed to get container status \"1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15\": rpc error: code = NotFound desc = could not find container \"1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15\": container with ID starting with 1558a2ac4ddaf4929d2f293a72014728e91a57cb75df5b735ce12dfeda40cf15 not found: ID does not exist" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.672946 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.694576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773d2aaa-49c3-4bfe-b744-72a2a863e701-logs\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.694641 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.694720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-config-data\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.694778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzs8\" (UniqueName: \"kubernetes.io/projected/773d2aaa-49c3-4bfe-b744-72a2a863e701-kube-api-access-hxzs8\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.796581 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzs8\" (UniqueName: \"kubernetes.io/projected/773d2aaa-49c3-4bfe-b744-72a2a863e701-kube-api-access-hxzs8\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.796800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773d2aaa-49c3-4bfe-b744-72a2a863e701-logs\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.796865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.796926 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-config-data\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.798178 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773d2aaa-49c3-4bfe-b744-72a2a863e701-logs\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.803513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-config-data\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.804655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.812793 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzs8\" (UniqueName: \"kubernetes.io/projected/773d2aaa-49c3-4bfe-b744-72a2a863e701-kube-api-access-hxzs8\") pod \"nova-api-0\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") " pod="openstack/nova-api-0" Mar 10 15:30:04 crc kubenswrapper[4743]: I0310 15:30:04.890393 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:30:05 crc kubenswrapper[4743]: I0310 15:30:05.357745 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:05 crc kubenswrapper[4743]: W0310 15:30:05.361948 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod773d2aaa_49c3_4bfe_b744_72a2a863e701.slice/crio-f5fdcda425f99c5fa400ff6a64d9509d95962766a44f5f91c58eaa24eb4ecd69 WatchSource:0}: Error finding container f5fdcda425f99c5fa400ff6a64d9509d95962766a44f5f91c58eaa24eb4ecd69: Status 404 returned error can't find the container with id f5fdcda425f99c5fa400ff6a64d9509d95962766a44f5f91c58eaa24eb4ecd69 Mar 10 15:30:05 crc kubenswrapper[4743]: I0310 15:30:05.465763 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"773d2aaa-49c3-4bfe-b744-72a2a863e701","Type":"ContainerStarted","Data":"f5fdcda425f99c5fa400ff6a64d9509d95962766a44f5f91c58eaa24eb4ecd69"} Mar 10 15:30:05 crc kubenswrapper[4743]: I0310 15:30:05.470921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerStarted","Data":"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42"} Mar 10 15:30:05 crc kubenswrapper[4743]: I0310 15:30:05.470956 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerStarted","Data":"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230"} Mar 10 15:30:05 crc kubenswrapper[4743]: I0310 15:30:05.473827 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c8a28f7-44f1-4871-bff9-1d64242a7f5e" containerID="455a255e318e3164ee52d227a1eb2481f1f6683044946c4e16ddfb8dfe57566e" exitCode=0 Mar 10 15:30:05 crc kubenswrapper[4743]: I0310 15:30:05.473865 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-infra/auto-csr-approver-29552610-k7w8j" event={"ID":"7c8a28f7-44f1-4871-bff9-1d64242a7f5e","Type":"ContainerDied","Data":"455a255e318e3164ee52d227a1eb2481f1f6683044946c4e16ddfb8dfe57566e"} Mar 10 15:30:05 crc kubenswrapper[4743]: I0310 15:30:05.933602 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6361004a-5259-434e-a899-81704ebee56c" path="/var/lib/kubelet/pods/6361004a-5259-434e-a899-81704ebee56c/volumes" Mar 10 15:30:06 crc kubenswrapper[4743]: I0310 15:30:06.485659 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"773d2aaa-49c3-4bfe-b744-72a2a863e701","Type":"ContainerStarted","Data":"3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84"} Mar 10 15:30:06 crc kubenswrapper[4743]: I0310 15:30:06.486697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"773d2aaa-49c3-4bfe-b744-72a2a863e701","Type":"ContainerStarted","Data":"7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9"} Mar 10 15:30:06 crc kubenswrapper[4743]: I0310 15:30:06.488530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerStarted","Data":"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61"} Mar 10 15:30:06 crc kubenswrapper[4743]: I0310 15:30:06.510694 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5106579719999997 podStartE2EDuration="2.510657972s" podCreationTimestamp="2026-03-10 15:30:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:06.507944294 +0000 UTC m=+1471.214759042" watchObservedRunningTime="2026-03-10 15:30:06.510657972 +0000 UTC m=+1471.217472770" Mar 10 15:30:06 crc kubenswrapper[4743]: I0310 15:30:06.954294 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 15:30:06 crc kubenswrapper[4743]: I0310 15:30:06.964600 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552610-k7w8j" Mar 10 15:30:07 crc kubenswrapper[4743]: I0310 15:30:07.050673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrztd\" (UniqueName: \"kubernetes.io/projected/7c8a28f7-44f1-4871-bff9-1d64242a7f5e-kube-api-access-jrztd\") pod \"7c8a28f7-44f1-4871-bff9-1d64242a7f5e\" (UID: \"7c8a28f7-44f1-4871-bff9-1d64242a7f5e\") " Mar 10 15:30:07 crc kubenswrapper[4743]: I0310 15:30:07.057200 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8a28f7-44f1-4871-bff9-1d64242a7f5e-kube-api-access-jrztd" (OuterVolumeSpecName: "kube-api-access-jrztd") pod "7c8a28f7-44f1-4871-bff9-1d64242a7f5e" (UID: "7c8a28f7-44f1-4871-bff9-1d64242a7f5e"). InnerVolumeSpecName "kube-api-access-jrztd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:07 crc kubenswrapper[4743]: I0310 15:30:07.152894 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrztd\" (UniqueName: \"kubernetes.io/projected/7c8a28f7-44f1-4871-bff9-1d64242a7f5e-kube-api-access-jrztd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:07 crc kubenswrapper[4743]: I0310 15:30:07.264174 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:30:07 crc kubenswrapper[4743]: I0310 15:30:07.265024 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:30:07 crc kubenswrapper[4743]: I0310 15:30:07.500143 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552610-k7w8j" Mar 10 15:30:07 crc kubenswrapper[4743]: I0310 15:30:07.512807 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552610-k7w8j" event={"ID":"7c8a28f7-44f1-4871-bff9-1d64242a7f5e","Type":"ContainerDied","Data":"64389d53dac5c949887d216d46d8ffa6983f26a858feb4b53ddaf58e1c76e958"} Mar 10 15:30:07 crc kubenswrapper[4743]: I0310 15:30:07.512955 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64389d53dac5c949887d216d46d8ffa6983f26a858feb4b53ddaf58e1c76e958" Mar 10 15:30:08 crc kubenswrapper[4743]: I0310 15:30:08.039941 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552604-lrmxz"] Mar 10 15:30:08 crc kubenswrapper[4743]: I0310 15:30:08.049854 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552604-lrmxz"] Mar 10 15:30:08 crc kubenswrapper[4743]: I0310 15:30:08.516573 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerStarted","Data":"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793"} Mar 10 15:30:08 crc kubenswrapper[4743]: I0310 15:30:08.517018 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:30:08 crc kubenswrapper[4743]: I0310 15:30:08.562861 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.281328911 podStartE2EDuration="6.562788854s" podCreationTimestamp="2026-03-10 15:30:02 +0000 UTC" firstStartedPulling="2026-03-10 15:30:03.546193475 +0000 UTC m=+1468.253008223" lastFinishedPulling="2026-03-10 15:30:07.827653398 +0000 UTC m=+1472.534468166" observedRunningTime="2026-03-10 15:30:08.553330033 +0000 UTC m=+1473.260144841" watchObservedRunningTime="2026-03-10 
15:30:08.562788854 +0000 UTC m=+1473.269603642" Mar 10 15:30:08 crc kubenswrapper[4743]: I0310 15:30:08.794844 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 15:30:09 crc kubenswrapper[4743]: I0310 15:30:09.935173 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a6bc3c-7fe6-4c98-b45a-01296d21caf4" path="/var/lib/kubelet/pods/e1a6bc3c-7fe6-4c98-b45a-01296d21caf4/volumes" Mar 10 15:30:11 crc kubenswrapper[4743]: I0310 15:30:11.954718 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 15:30:12 crc kubenswrapper[4743]: I0310 15:30:12.014952 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 15:30:12 crc kubenswrapper[4743]: I0310 15:30:12.264246 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 15:30:12 crc kubenswrapper[4743]: I0310 15:30:12.264633 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 15:30:12 crc kubenswrapper[4743]: I0310 15:30:12.608089 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 15:30:13 crc kubenswrapper[4743]: I0310 15:30:13.280032 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 15:30:13 crc kubenswrapper[4743]: I0310 15:30:13.280046 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:30:14 crc kubenswrapper[4743]: I0310 15:30:14.891210 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:30:14 crc kubenswrapper[4743]: I0310 15:30:14.891525 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:30:15 crc kubenswrapper[4743]: I0310 15:30:15.973025 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.227:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 15:30:15 crc kubenswrapper[4743]: I0310 15:30:15.973435 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.227:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.271768 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.277314 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.280024 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.670392 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.689271 4743 generic.go:334] "Generic (PLEG): container finished" podID="98838081-e32b-46ae-b757-72abda3f9737" containerID="d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7" exitCode=137 Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.689312 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.689348 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"98838081-e32b-46ae-b757-72abda3f9737","Type":"ContainerDied","Data":"d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7"} Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.689402 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"98838081-e32b-46ae-b757-72abda3f9737","Type":"ContainerDied","Data":"5c4cc2b54ea06aab996e8938d05f039902f21a6e85ff01f8c7d92dcb72bc5a0a"} Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.689423 4743 scope.go:117] "RemoveContainer" containerID="d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.699567 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.728929 4743 scope.go:117] "RemoveContainer" containerID="d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7" Mar 10 15:30:22 crc kubenswrapper[4743]: E0310 15:30:22.732141 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7\": container with ID starting with 
d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7 not found: ID does not exist" containerID="d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.732197 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7"} err="failed to get container status \"d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7\": rpc error: code = NotFound desc = could not find container \"d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7\": container with ID starting with d0c8c98db100dc7f44db20eb815808b55fafdee9f679a94c912544066c7850c7 not found: ID does not exist" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.735025 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-combined-ca-bundle\") pod \"98838081-e32b-46ae-b757-72abda3f9737\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.735145 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65nwx\" (UniqueName: \"kubernetes.io/projected/98838081-e32b-46ae-b757-72abda3f9737-kube-api-access-65nwx\") pod \"98838081-e32b-46ae-b757-72abda3f9737\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.735178 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-config-data\") pod \"98838081-e32b-46ae-b757-72abda3f9737\" (UID: \"98838081-e32b-46ae-b757-72abda3f9737\") " Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.759389 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/98838081-e32b-46ae-b757-72abda3f9737-kube-api-access-65nwx" (OuterVolumeSpecName: "kube-api-access-65nwx") pod "98838081-e32b-46ae-b757-72abda3f9737" (UID: "98838081-e32b-46ae-b757-72abda3f9737"). InnerVolumeSpecName "kube-api-access-65nwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.786609 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98838081-e32b-46ae-b757-72abda3f9737" (UID: "98838081-e32b-46ae-b757-72abda3f9737"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.811162 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-config-data" (OuterVolumeSpecName: "config-data") pod "98838081-e32b-46ae-b757-72abda3f9737" (UID: "98838081-e32b-46ae-b757-72abda3f9737"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.837517 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.837551 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65nwx\" (UniqueName: \"kubernetes.io/projected/98838081-e32b-46ae-b757-72abda3f9737-kube-api-access-65nwx\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:22 crc kubenswrapper[4743]: I0310 15:30:22.837565 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98838081-e32b-46ae-b757-72abda3f9737-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.022911 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.033125 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.060236 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:30:23 crc kubenswrapper[4743]: E0310 15:30:23.060764 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8a28f7-44f1-4871-bff9-1d64242a7f5e" containerName="oc" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.060783 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8a28f7-44f1-4871-bff9-1d64242a7f5e" containerName="oc" Mar 10 15:30:23 crc kubenswrapper[4743]: E0310 15:30:23.060825 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98838081-e32b-46ae-b757-72abda3f9737" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.060838 
4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="98838081-e32b-46ae-b757-72abda3f9737" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.061067 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="98838081-e32b-46ae-b757-72abda3f9737" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.061098 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8a28f7-44f1-4871-bff9-1d64242a7f5e" containerName="oc" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.062436 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.068782 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.070419 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.071327 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.075210 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.145314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.145462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.145562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8kgc\" (UniqueName: \"kubernetes.io/projected/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-kube-api-access-v8kgc\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.145724 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.145831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.247406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.247489 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.247525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8kgc\" (UniqueName: \"kubernetes.io/projected/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-kube-api-access-v8kgc\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.247637 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.247751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.252896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.254138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.255477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.256008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.265830 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8kgc\" (UniqueName: \"kubernetes.io/projected/fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d-kube-api-access-v8kgc\") pod \"nova-cell1-novncproxy-0\" (UID: \"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.385731 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:23 crc kubenswrapper[4743]: W0310 15:30:23.701319 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe5569d1_9407_4a10_bfad_6a9f8f2b6e3d.slice/crio-1fe9cc8257e58f54c906788ebdc62b1e16cab45e5072c904a65599c988a5a795 WatchSource:0}: Error finding container 1fe9cc8257e58f54c906788ebdc62b1e16cab45e5072c904a65599c988a5a795: Status 404 returned error can't find the container with id 1fe9cc8257e58f54c906788ebdc62b1e16cab45e5072c904a65599c988a5a795 Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.702729 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:30:23 crc kubenswrapper[4743]: I0310 15:30:23.930840 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98838081-e32b-46ae-b757-72abda3f9737" path="/var/lib/kubelet/pods/98838081-e32b-46ae-b757-72abda3f9737/volumes" Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.714132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d","Type":"ContainerStarted","Data":"5ab2b06778cb28f3da8efc351c0941de28166053f61ff1e82e448f3ed3a8f320"} Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.714215 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d","Type":"ContainerStarted","Data":"1fe9cc8257e58f54c906788ebdc62b1e16cab45e5072c904a65599c988a5a795"} Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.748259 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.748239275 podStartE2EDuration="1.748239275s" podCreationTimestamp="2026-03-10 15:30:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:24.738646401 +0000 UTC m=+1489.445461169" watchObservedRunningTime="2026-03-10 15:30:24.748239275 +0000 UTC m=+1489.455054023" Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.894981 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.895118 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.895983 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.896025 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.906544 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 15:30:24 crc kubenswrapper[4743]: I0310 15:30:24.909413 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.124789 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-4297n"] Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.130950 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.141676 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-4297n"] Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.191965 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-sb\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.192032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-config\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.192073 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-swift-storage-0\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.192251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-svc\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.192491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-7qpst\" (UniqueName: \"kubernetes.io/projected/17052016-cb68-4e74-82e3-05531596e17e-kube-api-access-7qpst\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.192595 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-nb\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.293395 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-nb\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.293508 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-sb\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.293578 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-config\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.293615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-swift-storage-0\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.293648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-svc\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.293722 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qpst\" (UniqueName: \"kubernetes.io/projected/17052016-cb68-4e74-82e3-05531596e17e-kube-api-access-7qpst\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.294718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-config\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.294740 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-svc\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.294724 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.294983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-sb\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.295325 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-swift-storage-0\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.318123 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qpst\" (UniqueName: \"kubernetes.io/projected/17052016-cb68-4e74-82e3-05531596e17e-kube-api-access-7qpst\") pod \"dnsmasq-dns-5958d5dc75-4297n\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:25 crc kubenswrapper[4743]: I0310 15:30:25.486926 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:26 crc kubenswrapper[4743]: I0310 15:30:26.094994 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-4297n"] Mar 10 15:30:26 crc kubenswrapper[4743]: I0310 15:30:26.731290 4743 generic.go:334] "Generic (PLEG): container finished" podID="17052016-cb68-4e74-82e3-05531596e17e" containerID="77c33115a71c71745d87fcad68674d5d2dfb17a36aae322113646b810ccf6e40" exitCode=0 Mar 10 15:30:26 crc kubenswrapper[4743]: I0310 15:30:26.731381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" event={"ID":"17052016-cb68-4e74-82e3-05531596e17e","Type":"ContainerDied","Data":"77c33115a71c71745d87fcad68674d5d2dfb17a36aae322113646b810ccf6e40"} Mar 10 15:30:26 crc kubenswrapper[4743]: I0310 15:30:26.731911 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" event={"ID":"17052016-cb68-4e74-82e3-05531596e17e","Type":"ContainerStarted","Data":"b569082d7dfcd66dd4c11924ed9ae7395a2af7c83d27be3dc2553186be069683"} Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.383070 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.383746 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="ceilometer-central-agent" containerID="cri-o://3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230" gracePeriod=30 Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.385111 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="sg-core" containerID="cri-o://6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61" gracePeriod=30 Mar 10 15:30:27 crc 
kubenswrapper[4743]: I0310 15:30:27.385136 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="proxy-httpd" containerID="cri-o://b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793" gracePeriod=30 Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.385139 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="ceilometer-notification-agent" containerID="cri-o://b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42" gracePeriod=30 Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.391982 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.226:3000/\": read tcp 10.217.0.2:46386->10.217.0.226:3000: read: connection reset by peer" Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.750139 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" event={"ID":"17052016-cb68-4e74-82e3-05531596e17e","Type":"ContainerStarted","Data":"01d944739ade9c3156cc5e5e277939e2b77016237037ffdde1f2ecc1ceb56bef"} Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.750261 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.754433 4743 generic.go:334] "Generic (PLEG): container finished" podID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerID="b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793" exitCode=0 Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.754727 4743 generic.go:334] "Generic (PLEG): container finished" podID="40027638-16c4-4a09-891f-7f1d3524d5f2" 
containerID="6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61" exitCode=2 Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.754754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerDied","Data":"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793"} Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.754784 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.754805 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerDied","Data":"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61"} Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.755022 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-log" containerID="cri-o://7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9" gracePeriod=30 Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.755151 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-api" containerID="cri-o://3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84" gracePeriod=30 Mar 10 15:30:27 crc kubenswrapper[4743]: I0310 15:30:27.784003 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" podStartSLOduration=2.783971682 podStartE2EDuration="2.783971682s" podCreationTimestamp="2026-03-10 15:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:27.777874697 +0000 UTC m=+1492.484689445" 
watchObservedRunningTime="2026-03-10 15:30:27.783971682 +0000 UTC m=+1492.490786420" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.247012 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.355453 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-config-data\") pod \"40027638-16c4-4a09-891f-7f1d3524d5f2\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.355500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-log-httpd\") pod \"40027638-16c4-4a09-891f-7f1d3524d5f2\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.355667 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-scripts\") pod \"40027638-16c4-4a09-891f-7f1d3524d5f2\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.355709 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-run-httpd\") pod \"40027638-16c4-4a09-891f-7f1d3524d5f2\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.355878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-combined-ca-bundle\") pod \"40027638-16c4-4a09-891f-7f1d3524d5f2\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " Mar 10 
15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.355967 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-sg-core-conf-yaml\") pod \"40027638-16c4-4a09-891f-7f1d3524d5f2\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.355993 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j29gn\" (UniqueName: \"kubernetes.io/projected/40027638-16c4-4a09-891f-7f1d3524d5f2-kube-api-access-j29gn\") pod \"40027638-16c4-4a09-891f-7f1d3524d5f2\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.356025 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-ceilometer-tls-certs\") pod \"40027638-16c4-4a09-891f-7f1d3524d5f2\" (UID: \"40027638-16c4-4a09-891f-7f1d3524d5f2\") " Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.356490 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "40027638-16c4-4a09-891f-7f1d3524d5f2" (UID: "40027638-16c4-4a09-891f-7f1d3524d5f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.356653 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "40027638-16c4-4a09-891f-7f1d3524d5f2" (UID: "40027638-16c4-4a09-891f-7f1d3524d5f2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.362483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-scripts" (OuterVolumeSpecName: "scripts") pod "40027638-16c4-4a09-891f-7f1d3524d5f2" (UID: "40027638-16c4-4a09-891f-7f1d3524d5f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.366698 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40027638-16c4-4a09-891f-7f1d3524d5f2-kube-api-access-j29gn" (OuterVolumeSpecName: "kube-api-access-j29gn") pod "40027638-16c4-4a09-891f-7f1d3524d5f2" (UID: "40027638-16c4-4a09-891f-7f1d3524d5f2"). InnerVolumeSpecName "kube-api-access-j29gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.386415 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.404093 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "40027638-16c4-4a09-891f-7f1d3524d5f2" (UID: "40027638-16c4-4a09-891f-7f1d3524d5f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.431915 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "40027638-16c4-4a09-891f-7f1d3524d5f2" (UID: "40027638-16c4-4a09-891f-7f1d3524d5f2"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.458149 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.458186 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j29gn\" (UniqueName: \"kubernetes.io/projected/40027638-16c4-4a09-891f-7f1d3524d5f2-kube-api-access-j29gn\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.458201 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.458212 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.458223 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.458235 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40027638-16c4-4a09-891f-7f1d3524d5f2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.465013 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40027638-16c4-4a09-891f-7f1d3524d5f2" (UID: 
"40027638-16c4-4a09-891f-7f1d3524d5f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.497621 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-config-data" (OuterVolumeSpecName: "config-data") pod "40027638-16c4-4a09-891f-7f1d3524d5f2" (UID: "40027638-16c4-4a09-891f-7f1d3524d5f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.560636 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.560681 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40027638-16c4-4a09-891f-7f1d3524d5f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.774801 4743 generic.go:334] "Generic (PLEG): container finished" podID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerID="7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9" exitCode=143 Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.774926 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"773d2aaa-49c3-4bfe-b744-72a2a863e701","Type":"ContainerDied","Data":"7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9"} Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.778082 4743 generic.go:334] "Generic (PLEG): container finished" podID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerID="b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42" exitCode=0 Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.778102 4743 generic.go:334] "Generic 
(PLEG): container finished" podID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerID="3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230" exitCode=0
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.778171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerDied","Data":"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42"}
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.778191 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.778251 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerDied","Data":"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230"}
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.778309 4743 scope.go:117] "RemoveContainer" containerID="b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.778511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40027638-16c4-4a09-891f-7f1d3524d5f2","Type":"ContainerDied","Data":"23110ea90357f4e980e66419145667c50dbd5a2fd954074f171a1e6bf21c7c07"}
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.819143 4743 scope.go:117] "RemoveContainer" containerID="6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.830761 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.844404 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.845544 4743 scope.go:117] "RemoveContainer" containerID="b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.873140 4743 scope.go:117] "RemoveContainer" containerID="3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.874680 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:28 crc kubenswrapper[4743]: E0310 15:30:28.886557 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="ceilometer-notification-agent"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.886601 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="ceilometer-notification-agent"
Mar 10 15:30:28 crc kubenswrapper[4743]: E0310 15:30:28.886634 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="ceilometer-central-agent"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.886643 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="ceilometer-central-agent"
Mar 10 15:30:28 crc kubenswrapper[4743]: E0310 15:30:28.886701 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="proxy-httpd"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.886709 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="proxy-httpd"
Mar 10 15:30:28 crc kubenswrapper[4743]: E0310 15:30:28.886728 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="sg-core"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.886735 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="sg-core"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.891907 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="ceilometer-central-agent"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.891959 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="proxy-httpd"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.891981 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="ceilometer-notification-agent"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.892001 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" containerName="sg-core"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.899764 4743 scope.go:117] "RemoveContainer" containerID="b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793"
Mar 10 15:30:28 crc kubenswrapper[4743]: E0310 15:30:28.900227 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793\": container with ID starting with b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793 not found: ID does not exist" containerID="b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.900380 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793"} err="failed to get container status \"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793\": rpc error: code = NotFound desc = could not find container \"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793\": container with ID starting with b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793 not found: ID does not exist"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.900475 4743 scope.go:117] "RemoveContainer" containerID="6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.900688 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.900982 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: E0310 15:30:28.901017 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61\": container with ID starting with 6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61 not found: ID does not exist" containerID="6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.901043 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61"} err="failed to get container status \"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61\": rpc error: code = NotFound desc = could not find container \"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61\": container with ID starting with 6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61 not found: ID does not exist"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.901065 4743 scope.go:117] "RemoveContainer" containerID="b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42"
Mar 10 15:30:28 crc kubenswrapper[4743]: E0310 15:30:28.901274 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42\": container with ID starting with b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42 not found: ID does not exist" containerID="b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.901299 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42"} err="failed to get container status \"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42\": rpc error: code = NotFound desc = could not find container \"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42\": container with ID starting with b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42 not found: ID does not exist"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.901318 4743 scope.go:117] "RemoveContainer" containerID="3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230"
Mar 10 15:30:28 crc kubenswrapper[4743]: E0310 15:30:28.901508 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230\": container with ID starting with 3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230 not found: ID does not exist" containerID="3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.901531 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230"} err="failed to get container status \"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230\": rpc error: code = NotFound desc = could not find container \"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230\": container with ID starting with 3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230 not found: ID does not exist"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.901547 4743 scope.go:117] "RemoveContainer" containerID="b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.901732 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793"} err="failed to get container status \"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793\": rpc error: code = NotFound desc = could not find container \"b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793\": container with ID starting with b80c72ad6b9af8f6c5cc19eeb01e52e04a7ba7c7f7582b31bbbb684731e2c793 not found: ID does not exist"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.901754 4743 scope.go:117] "RemoveContainer" containerID="6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.902853 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61"} err="failed to get container status \"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61\": rpc error: code = NotFound desc = could not find container \"6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61\": container with ID starting with 6dc69e085a57ba213922cbec5dea824285bcc6e666a949ccae3ab07b83dc7c61 not found: ID does not exist"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.902874 4743 scope.go:117] "RemoveContainer" containerID="b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.906981 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42"} err="failed to get container status \"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42\": rpc error: code = NotFound desc = could not find container \"b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42\": container with ID starting with b8f2dbccf24c6475855cce8bbddbe4b5ca79a0b730e794fc16bce073ad060a42 not found: ID does not exist"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.907022 4743 scope.go:117] "RemoveContainer" containerID="3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.907834 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.907962 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.908575 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.912465 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230"} err="failed to get container status \"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230\": rpc error: code = NotFound desc = could not find container \"3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230\": container with ID starting with 3e25d8590d6eed65d4a08b2beb271ade2d4badc316cffc1cab6e85c257d63230 not found: ID does not exist"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.983086 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-log-httpd\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.983185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.983251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhd64\" (UniqueName: \"kubernetes.io/projected/aad87b0f-ec79-4849-84a3-d18545d44913-kube-api-access-bhd64\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.983315 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.983340 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.983370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-scripts\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.983402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-config-data\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:28 crc kubenswrapper[4743]: I0310 15:30:28.983440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-run-httpd\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.085039 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.085098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.085124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-scripts\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.085153 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-config-data\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.085193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-run-httpd\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.085243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-log-httpd\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.085292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.085368 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhd64\" (UniqueName: \"kubernetes.io/projected/aad87b0f-ec79-4849-84a3-d18545d44913-kube-api-access-bhd64\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.086943 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-log-httpd\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.086970 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-run-httpd\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.090750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.091156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.091702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-config-data\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.092028 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-scripts\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.092444 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.103621 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhd64\" (UniqueName: \"kubernetes.io/projected/aad87b0f-ec79-4849-84a3-d18545d44913-kube-api-access-bhd64\") pod \"ceilometer-0\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.228149 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.373782 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:29 crc kubenswrapper[4743]: W0310 15:30:29.740266 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaad87b0f_ec79_4849_84a3_d18545d44913.slice/crio-0c05a5aeab009dec472ad6978dfd3ae3d0f2b8274c9715f430a5120adaba11dc WatchSource:0}: Error finding container 0c05a5aeab009dec472ad6978dfd3ae3d0f2b8274c9715f430a5120adaba11dc: Status 404 returned error can't find the container with id 0c05a5aeab009dec472ad6978dfd3ae3d0f2b8274c9715f430a5120adaba11dc
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.743730 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.793623 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerStarted","Data":"0c05a5aeab009dec472ad6978dfd3ae3d0f2b8274c9715f430a5120adaba11dc"}
Mar 10 15:30:29 crc kubenswrapper[4743]: I0310 15:30:29.925504 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40027638-16c4-4a09-891f-7f1d3524d5f2" path="/var/lib/kubelet/pods/40027638-16c4-4a09-891f-7f1d3524d5f2/volumes"
Mar 10 15:30:30 crc kubenswrapper[4743]: I0310 15:30:30.821219 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerStarted","Data":"9ec33bec05981d191ddc65a9f33543638d0c0da69f928f04daf510fc95e1095a"}
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.416189 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.533664 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-config-data\") pod \"773d2aaa-49c3-4bfe-b744-72a2a863e701\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") "
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.533737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773d2aaa-49c3-4bfe-b744-72a2a863e701-logs\") pod \"773d2aaa-49c3-4bfe-b744-72a2a863e701\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") "
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.534005 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxzs8\" (UniqueName: \"kubernetes.io/projected/773d2aaa-49c3-4bfe-b744-72a2a863e701-kube-api-access-hxzs8\") pod \"773d2aaa-49c3-4bfe-b744-72a2a863e701\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") "
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.534061 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-combined-ca-bundle\") pod \"773d2aaa-49c3-4bfe-b744-72a2a863e701\" (UID: \"773d2aaa-49c3-4bfe-b744-72a2a863e701\") "
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.534837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773d2aaa-49c3-4bfe-b744-72a2a863e701-logs" (OuterVolumeSpecName: "logs") pod "773d2aaa-49c3-4bfe-b744-72a2a863e701" (UID: "773d2aaa-49c3-4bfe-b744-72a2a863e701"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.547971 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773d2aaa-49c3-4bfe-b744-72a2a863e701-kube-api-access-hxzs8" (OuterVolumeSpecName: "kube-api-access-hxzs8") pod "773d2aaa-49c3-4bfe-b744-72a2a863e701" (UID: "773d2aaa-49c3-4bfe-b744-72a2a863e701"). InnerVolumeSpecName "kube-api-access-hxzs8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.576943 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-config-data" (OuterVolumeSpecName: "config-data") pod "773d2aaa-49c3-4bfe-b744-72a2a863e701" (UID: "773d2aaa-49c3-4bfe-b744-72a2a863e701"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.590399 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "773d2aaa-49c3-4bfe-b744-72a2a863e701" (UID: "773d2aaa-49c3-4bfe-b744-72a2a863e701"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.636977 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxzs8\" (UniqueName: \"kubernetes.io/projected/773d2aaa-49c3-4bfe-b744-72a2a863e701-kube-api-access-hxzs8\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.637013 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.637022 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773d2aaa-49c3-4bfe-b744-72a2a863e701-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.637031 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773d2aaa-49c3-4bfe-b744-72a2a863e701-logs\") on node \"crc\" DevicePath \"\""
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.833538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerStarted","Data":"eb9619f16dedfd871017a0ee0be24d182824d094961059ba672e22b59ddd07e3"}
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.835720 4743 generic.go:334] "Generic (PLEG): container finished" podID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerID="3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84" exitCode=0
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.835762 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"773d2aaa-49c3-4bfe-b744-72a2a863e701","Type":"ContainerDied","Data":"3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84"}
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.835785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"773d2aaa-49c3-4bfe-b744-72a2a863e701","Type":"ContainerDied","Data":"f5fdcda425f99c5fa400ff6a64d9509d95962766a44f5f91c58eaa24eb4ecd69"}
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.835783 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.835803 4743 scope.go:117] "RemoveContainer" containerID="3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.887879 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.889845 4743 scope.go:117] "RemoveContainer" containerID="7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.895103 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.910263 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 10 15:30:31 crc kubenswrapper[4743]: E0310 15:30:31.910700 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-log"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.910716 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-log"
Mar 10 15:30:31 crc kubenswrapper[4743]: E0310 15:30:31.910746 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-api"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.910755 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-api"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.910974 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-api"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.911000 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" containerName="nova-api-log"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.911958 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.919823 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.920077 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.920238 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.920903 4743 scope.go:117] "RemoveContainer" containerID="3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84"
Mar 10 15:30:31 crc kubenswrapper[4743]: E0310 15:30:31.921464 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84\": container with ID starting with 3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84 not found: ID does not exist" containerID="3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.921507 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84"} err="failed to get container status \"3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84\": rpc error: code = NotFound desc = could not find container \"3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84\": container with ID starting with 3dce65c1fb1c876eaaf246d3c7573f6d556b1431b53ccf43952dc3f19f053b84 not found: ID does not exist"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.921532 4743 scope.go:117] "RemoveContainer" containerID="7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9"
Mar 10 15:30:31 crc kubenswrapper[4743]: E0310 15:30:31.922268 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9\": container with ID starting with 7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9 not found: ID does not exist" containerID="7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.922297 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9"} err="failed to get container status \"7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9\": rpc error: code = NotFound desc = could not find container \"7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9\": container with ID starting with 7ef94914dde6ccf53bb3a33a0b53e569d005acc08eae0b95e4a66c2d5a9bb6b9 not found: ID does not exist"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.933995 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="773d2aaa-49c3-4bfe-b744-72a2a863e701" path="/var/lib/kubelet/pods/773d2aaa-49c3-4bfe-b744-72a2a863e701/volumes"
Mar 10 15:30:31 crc kubenswrapper[4743]: I0310 15:30:31.959304 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.045185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.045285 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klkmk\" (UniqueName: \"kubernetes.io/projected/fa232077-e977-4b86-94fa-026a7630c4f2-kube-api-access-klkmk\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.045317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.046414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.046489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa232077-e977-4b86-94fa-026a7630c4f2-logs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.046586 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-config-data\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.148155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-config-data\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.148254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.148310 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klkmk\" (UniqueName: \"kubernetes.io/projected/fa232077-e977-4b86-94fa-026a7630c4f2-kube-api-access-klkmk\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.148331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.148415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0"
Mar 10
15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.148440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa232077-e977-4b86-94fa-026a7630c4f2-logs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0" Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.148928 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa232077-e977-4b86-94fa-026a7630c4f2-logs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0" Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.155283 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0" Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.155305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0" Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.155505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-config-data\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0" Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.156324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0" Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.168142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klkmk\" (UniqueName: \"kubernetes.io/projected/fa232077-e977-4b86-94fa-026a7630c4f2-kube-api-access-klkmk\") pod \"nova-api-0\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " pod="openstack/nova-api-0" Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.244341 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:30:32 crc kubenswrapper[4743]: W0310 15:30:32.723186 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa232077_e977_4b86_94fa_026a7630c4f2.slice/crio-2e118909daddfd015532110a3b7596fd82d71c4751b38e57c23781336455acfc WatchSource:0}: Error finding container 2e118909daddfd015532110a3b7596fd82d71c4751b38e57c23781336455acfc: Status 404 returned error can't find the container with id 2e118909daddfd015532110a3b7596fd82d71c4751b38e57c23781336455acfc Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.732609 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.854156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerStarted","Data":"003a6273a2c7f6b7b17d05b1fe27a57499610eee96ec01fc387be14adbe335e7"} Mar 10 15:30:32 crc kubenswrapper[4743]: I0310 15:30:32.855922 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa232077-e977-4b86-94fa-026a7630c4f2","Type":"ContainerStarted","Data":"2e118909daddfd015532110a3b7596fd82d71c4751b38e57c23781336455acfc"} Mar 10 15:30:33 crc kubenswrapper[4743]: I0310 15:30:33.387446 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:33 crc kubenswrapper[4743]: I0310 15:30:33.414369 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:33 crc kubenswrapper[4743]: I0310 15:30:33.868222 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa232077-e977-4b86-94fa-026a7630c4f2","Type":"ContainerStarted","Data":"faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709"} Mar 10 15:30:33 crc kubenswrapper[4743]: I0310 15:30:33.868652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa232077-e977-4b86-94fa-026a7630c4f2","Type":"ContainerStarted","Data":"a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb"} Mar 10 15:30:33 crc kubenswrapper[4743]: I0310 15:30:33.897513 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:30:33 crc kubenswrapper[4743]: I0310 15:30:33.898889 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.898871668 podStartE2EDuration="2.898871668s" podCreationTimestamp="2026-03-10 15:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:33.891733863 +0000 UTC m=+1498.598548611" watchObservedRunningTime="2026-03-10 15:30:33.898871668 +0000 UTC m=+1498.605686416" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.119440 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gf5kp"] Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.121309 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.124307 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.127839 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.133632 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gf5kp"] Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.200240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-config-data\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.200457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsdvf\" (UniqueName: \"kubernetes.io/projected/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-kube-api-access-nsdvf\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.200584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.200878 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-scripts\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.304367 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsdvf\" (UniqueName: \"kubernetes.io/projected/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-kube-api-access-nsdvf\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.304797 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.304893 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-scripts\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.304934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-config-data\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.311493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-scripts\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.314558 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-config-data\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.314975 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.324267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsdvf\" (UniqueName: \"kubernetes.io/projected/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-kube-api-access-nsdvf\") pod \"nova-cell1-cell-mapping-gf5kp\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.445874 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.880233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerStarted","Data":"85f3bbc6265ef1cb94a1c6aae9abf818416b525f7df8c356a5ab0ec74b554a5d"} Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.880310 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="ceilometer-central-agent" containerID="cri-o://9ec33bec05981d191ddc65a9f33543638d0c0da69f928f04daf510fc95e1095a" gracePeriod=30 Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.880729 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="proxy-httpd" containerID="cri-o://85f3bbc6265ef1cb94a1c6aae9abf818416b525f7df8c356a5ab0ec74b554a5d" gracePeriod=30 Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.880964 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="sg-core" containerID="cri-o://003a6273a2c7f6b7b17d05b1fe27a57499610eee96ec01fc387be14adbe335e7" gracePeriod=30 Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.881033 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.881115 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="ceilometer-notification-agent" containerID="cri-o://eb9619f16dedfd871017a0ee0be24d182824d094961059ba672e22b59ddd07e3" gracePeriod=30 Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.918206 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.755405629 podStartE2EDuration="6.918186205s" podCreationTimestamp="2026-03-10 15:30:28 +0000 UTC" firstStartedPulling="2026-03-10 15:30:29.743258707 +0000 UTC m=+1494.450073445" lastFinishedPulling="2026-03-10 15:30:33.906039273 +0000 UTC m=+1498.612854021" observedRunningTime="2026-03-10 15:30:34.905989486 +0000 UTC m=+1499.612804234" watchObservedRunningTime="2026-03-10 15:30:34.918186205 +0000 UTC m=+1499.625000953" Mar 10 15:30:34 crc kubenswrapper[4743]: I0310 15:30:34.967607 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gf5kp"] Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.489266 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.599970 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-2zfwg"] Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.600282 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" podUID="b5cf5878-917c-4aa7-b209-37045ef1dc13" containerName="dnsmasq-dns" containerID="cri-o://2ee6c40424e960f0b7645ad5e92e770a52224861aefdedb07c9a0a706ff062f5" gracePeriod=10 Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.891072 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gf5kp" event={"ID":"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3","Type":"ContainerStarted","Data":"87c551d40a377cf99116a7b432d9abad33977a05937958fecd64640cf7c067e2"} Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.891332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gf5kp" 
event={"ID":"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3","Type":"ContainerStarted","Data":"da43995a8038a65ed17516fc226e0bd51183b8f11f84ca72af5ce0d2c6bdee27"} Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.894685 4743 generic.go:334] "Generic (PLEG): container finished" podID="aad87b0f-ec79-4849-84a3-d18545d44913" containerID="85f3bbc6265ef1cb94a1c6aae9abf818416b525f7df8c356a5ab0ec74b554a5d" exitCode=0 Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.894735 4743 generic.go:334] "Generic (PLEG): container finished" podID="aad87b0f-ec79-4849-84a3-d18545d44913" containerID="003a6273a2c7f6b7b17d05b1fe27a57499610eee96ec01fc387be14adbe335e7" exitCode=2 Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.894745 4743 generic.go:334] "Generic (PLEG): container finished" podID="aad87b0f-ec79-4849-84a3-d18545d44913" containerID="eb9619f16dedfd871017a0ee0be24d182824d094961059ba672e22b59ddd07e3" exitCode=0 Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.894759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerDied","Data":"85f3bbc6265ef1cb94a1c6aae9abf818416b525f7df8c356a5ab0ec74b554a5d"} Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.894800 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerDied","Data":"003a6273a2c7f6b7b17d05b1fe27a57499610eee96ec01fc387be14adbe335e7"} Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.894831 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerDied","Data":"eb9619f16dedfd871017a0ee0be24d182824d094961059ba672e22b59ddd07e3"} Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.897245 4743 generic.go:334] "Generic (PLEG): container finished" podID="b5cf5878-917c-4aa7-b209-37045ef1dc13" 
containerID="2ee6c40424e960f0b7645ad5e92e770a52224861aefdedb07c9a0a706ff062f5" exitCode=0 Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.897290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" event={"ID":"b5cf5878-917c-4aa7-b209-37045ef1dc13","Type":"ContainerDied","Data":"2ee6c40424e960f0b7645ad5e92e770a52224861aefdedb07c9a0a706ff062f5"} Mar 10 15:30:35 crc kubenswrapper[4743]: I0310 15:30:35.908179 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gf5kp" podStartSLOduration=1.908157503 podStartE2EDuration="1.908157503s" podCreationTimestamp="2026-03-10 15:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:35.902470381 +0000 UTC m=+1500.609285129" watchObservedRunningTime="2026-03-10 15:30:35.908157503 +0000 UTC m=+1500.614972251" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.729289 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.862498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-sb\") pod \"b5cf5878-917c-4aa7-b209-37045ef1dc13\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.862570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-swift-storage-0\") pod \"b5cf5878-917c-4aa7-b209-37045ef1dc13\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.862651 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsrsx\" (UniqueName: \"kubernetes.io/projected/b5cf5878-917c-4aa7-b209-37045ef1dc13-kube-api-access-tsrsx\") pod \"b5cf5878-917c-4aa7-b209-37045ef1dc13\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.862682 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-svc\") pod \"b5cf5878-917c-4aa7-b209-37045ef1dc13\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.862728 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-config\") pod \"b5cf5878-917c-4aa7-b209-37045ef1dc13\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.862952 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-nb\") pod \"b5cf5878-917c-4aa7-b209-37045ef1dc13\" (UID: \"b5cf5878-917c-4aa7-b209-37045ef1dc13\") " Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.884312 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5cf5878-917c-4aa7-b209-37045ef1dc13-kube-api-access-tsrsx" (OuterVolumeSpecName: "kube-api-access-tsrsx") pod "b5cf5878-917c-4aa7-b209-37045ef1dc13" (UID: "b5cf5878-917c-4aa7-b209-37045ef1dc13"). InnerVolumeSpecName "kube-api-access-tsrsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.912305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" event={"ID":"b5cf5878-917c-4aa7-b209-37045ef1dc13","Type":"ContainerDied","Data":"6044d5f9aec9a51fe6797ca7cef2df326356fc34b92c85cda23860411489852f"} Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.912369 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858594bc89-2zfwg" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.912451 4743 scope.go:117] "RemoveContainer" containerID="2ee6c40424e960f0b7645ad5e92e770a52224861aefdedb07c9a0a706ff062f5" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.922785 4743 generic.go:334] "Generic (PLEG): container finished" podID="aad87b0f-ec79-4849-84a3-d18545d44913" containerID="9ec33bec05981d191ddc65a9f33543638d0c0da69f928f04daf510fc95e1095a" exitCode=0 Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.922981 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerDied","Data":"9ec33bec05981d191ddc65a9f33543638d0c0da69f928f04daf510fc95e1095a"} Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.938273 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5cf5878-917c-4aa7-b209-37045ef1dc13" (UID: "b5cf5878-917c-4aa7-b209-37045ef1dc13"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.939136 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5cf5878-917c-4aa7-b209-37045ef1dc13" (UID: "b5cf5878-917c-4aa7-b209-37045ef1dc13"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.943392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5cf5878-917c-4aa7-b209-37045ef1dc13" (UID: "b5cf5878-917c-4aa7-b209-37045ef1dc13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.949228 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-config" (OuterVolumeSpecName: "config") pod "b5cf5878-917c-4aa7-b209-37045ef1dc13" (UID: "b5cf5878-917c-4aa7-b209-37045ef1dc13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.955577 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5cf5878-917c-4aa7-b209-37045ef1dc13" (UID: "b5cf5878-917c-4aa7-b209-37045ef1dc13"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.964902 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsrsx\" (UniqueName: \"kubernetes.io/projected/b5cf5878-917c-4aa7-b209-37045ef1dc13-kube-api-access-tsrsx\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.964944 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.964954 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.964964 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.964972 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:36 crc kubenswrapper[4743]: I0310 15:30:36.964982 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cf5878-917c-4aa7-b209-37045ef1dc13-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.007657 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.015574 4743 scope.go:117] "RemoveContainer" containerID="52ec9b9398176cd1772b646c9f6973110da9f5838155696ccecc267acf1035e5" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.066626 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhd64\" (UniqueName: \"kubernetes.io/projected/aad87b0f-ec79-4849-84a3-d18545d44913-kube-api-access-bhd64\") pod \"aad87b0f-ec79-4849-84a3-d18545d44913\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.066712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-combined-ca-bundle\") pod \"aad87b0f-ec79-4849-84a3-d18545d44913\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.066859 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-run-httpd\") pod \"aad87b0f-ec79-4849-84a3-d18545d44913\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.066902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-config-data\") pod \"aad87b0f-ec79-4849-84a3-d18545d44913\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.066934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-sg-core-conf-yaml\") pod \"aad87b0f-ec79-4849-84a3-d18545d44913\" (UID: 
\"aad87b0f-ec79-4849-84a3-d18545d44913\") " Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.067078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-log-httpd\") pod \"aad87b0f-ec79-4849-84a3-d18545d44913\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.067103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-ceilometer-tls-certs\") pod \"aad87b0f-ec79-4849-84a3-d18545d44913\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.067150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-scripts\") pod \"aad87b0f-ec79-4849-84a3-d18545d44913\" (UID: \"aad87b0f-ec79-4849-84a3-d18545d44913\") " Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.071541 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aad87b0f-ec79-4849-84a3-d18545d44913" (UID: "aad87b0f-ec79-4849-84a3-d18545d44913"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.072077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad87b0f-ec79-4849-84a3-d18545d44913-kube-api-access-bhd64" (OuterVolumeSpecName: "kube-api-access-bhd64") pod "aad87b0f-ec79-4849-84a3-d18545d44913" (UID: "aad87b0f-ec79-4849-84a3-d18545d44913"). InnerVolumeSpecName "kube-api-access-bhd64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.072379 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-scripts" (OuterVolumeSpecName: "scripts") pod "aad87b0f-ec79-4849-84a3-d18545d44913" (UID: "aad87b0f-ec79-4849-84a3-d18545d44913"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.072828 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aad87b0f-ec79-4849-84a3-d18545d44913" (UID: "aad87b0f-ec79-4849-84a3-d18545d44913"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.101202 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aad87b0f-ec79-4849-84a3-d18545d44913" (UID: "aad87b0f-ec79-4849-84a3-d18545d44913"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.132835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "aad87b0f-ec79-4849-84a3-d18545d44913" (UID: "aad87b0f-ec79-4849-84a3-d18545d44913"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.151681 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aad87b0f-ec79-4849-84a3-d18545d44913" (UID: "aad87b0f-ec79-4849-84a3-d18545d44913"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.169713 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.169752 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.169768 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.169778 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhd64\" (UniqueName: \"kubernetes.io/projected/aad87b0f-ec79-4849-84a3-d18545d44913-kube-api-access-bhd64\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.169788 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.169797 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/aad87b0f-ec79-4849-84a3-d18545d44913-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.169823 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.178666 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-config-data" (OuterVolumeSpecName: "config-data") pod "aad87b0f-ec79-4849-84a3-d18545d44913" (UID: "aad87b0f-ec79-4849-84a3-d18545d44913"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.253649 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-2zfwg"] Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.262474 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-2zfwg"] Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.271360 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aad87b0f-ec79-4849-84a3-d18545d44913-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.942016 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5cf5878-917c-4aa7-b209-37045ef1dc13" path="/var/lib/kubelet/pods/b5cf5878-917c-4aa7-b209-37045ef1dc13/volumes" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.961544 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aad87b0f-ec79-4849-84a3-d18545d44913","Type":"ContainerDied","Data":"0c05a5aeab009dec472ad6978dfd3ae3d0f2b8274c9715f430a5120adaba11dc"} Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 
15:30:37.961749 4743 scope.go:117] "RemoveContainer" containerID="85f3bbc6265ef1cb94a1c6aae9abf818416b525f7df8c356a5ab0ec74b554a5d" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.961647 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.983380 4743 scope.go:117] "RemoveContainer" containerID="003a6273a2c7f6b7b17d05b1fe27a57499610eee96ec01fc387be14adbe335e7" Mar 10 15:30:37 crc kubenswrapper[4743]: I0310 15:30:37.992250 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.003880 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.008201 4743 scope.go:117] "RemoveContainer" containerID="eb9619f16dedfd871017a0ee0be24d182824d094961059ba672e22b59ddd07e3" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025130 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:30:38 crc kubenswrapper[4743]: E0310 15:30:38.025565 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="ceilometer-notification-agent" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025578 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="ceilometer-notification-agent" Mar 10 15:30:38 crc kubenswrapper[4743]: E0310 15:30:38.025604 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="proxy-httpd" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025610 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="proxy-httpd" Mar 10 15:30:38 crc kubenswrapper[4743]: E0310 15:30:38.025621 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cf5878-917c-4aa7-b209-37045ef1dc13" containerName="dnsmasq-dns" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025627 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cf5878-917c-4aa7-b209-37045ef1dc13" containerName="dnsmasq-dns" Mar 10 15:30:38 crc kubenswrapper[4743]: E0310 15:30:38.025637 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="sg-core" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025642 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="sg-core" Mar 10 15:30:38 crc kubenswrapper[4743]: E0310 15:30:38.025663 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="ceilometer-central-agent" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025669 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="ceilometer-central-agent" Mar 10 15:30:38 crc kubenswrapper[4743]: E0310 15:30:38.025678 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cf5878-917c-4aa7-b209-37045ef1dc13" containerName="init" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025684 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cf5878-917c-4aa7-b209-37045ef1dc13" containerName="init" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025941 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="sg-core" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025963 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="ceilometer-central-agent" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025973 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="proxy-httpd" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025980 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" containerName="ceilometer-notification-agent" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.025991 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5cf5878-917c-4aa7-b209-37045ef1dc13" containerName="dnsmasq-dns" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.027686 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.031529 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.031828 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.032450 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.041709 4743 scope.go:117] "RemoveContainer" containerID="9ec33bec05981d191ddc65a9f33543638d0c0da69f928f04daf510fc95e1095a" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.048042 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.085527 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-log-httpd\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.085611 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-scripts\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.085632 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-config-data\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.085727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.085788 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.085842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swrmb\" (UniqueName: \"kubernetes.io/projected/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-kube-api-access-swrmb\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.085857 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.085882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-run-httpd\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.187965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-log-httpd\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.188096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-scripts\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.188127 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-config-data\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.188217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 
15:30:38.188257 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.188286 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swrmb\" (UniqueName: \"kubernetes.io/projected/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-kube-api-access-swrmb\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.188315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.188349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-run-httpd\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.188623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-log-httpd\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.189417 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-run-httpd\") pod \"ceilometer-0\" 
(UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.193363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.194060 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.194436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.195673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-config-data\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.201992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-scripts\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.206390 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swrmb\" (UniqueName: 
\"kubernetes.io/projected/5869f21f-5fc7-4837-b7e1-688cc16dc3ef-kube-api-access-swrmb\") pod \"ceilometer-0\" (UID: \"5869f21f-5fc7-4837-b7e1-688cc16dc3ef\") " pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.361631 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:30:38 crc kubenswrapper[4743]: W0310 15:30:38.855882 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5869f21f_5fc7_4837_b7e1_688cc16dc3ef.slice/crio-508e4378b40ae6fa504712c1a4729c0c7f3dbf960d8ce9218cd38a09796a02ac WatchSource:0}: Error finding container 508e4378b40ae6fa504712c1a4729c0c7f3dbf960d8ce9218cd38a09796a02ac: Status 404 returned error can't find the container with id 508e4378b40ae6fa504712c1a4729c0c7f3dbf960d8ce9218cd38a09796a02ac Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.856247 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:30:38 crc kubenswrapper[4743]: I0310 15:30:38.975554 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5869f21f-5fc7-4837-b7e1-688cc16dc3ef","Type":"ContainerStarted","Data":"508e4378b40ae6fa504712c1a4729c0c7f3dbf960d8ce9218cd38a09796a02ac"} Mar 10 15:30:39 crc kubenswrapper[4743]: I0310 15:30:39.931094 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad87b0f-ec79-4849-84a3-d18545d44913" path="/var/lib/kubelet/pods/aad87b0f-ec79-4849-84a3-d18545d44913/volumes" Mar 10 15:30:40 crc kubenswrapper[4743]: I0310 15:30:40.010000 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5869f21f-5fc7-4837-b7e1-688cc16dc3ef","Type":"ContainerStarted","Data":"8eede294dac3d772c477dfdc234126ff3f068951dc714495153f3ce4dc6c3525"} Mar 10 15:30:41 crc kubenswrapper[4743]: I0310 15:30:41.027112 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"5869f21f-5fc7-4837-b7e1-688cc16dc3ef","Type":"ContainerStarted","Data":"cf88614f22b84bca088900a3ad9e9a4e64513a20acde4fd0098e46c45a8bd472"} Mar 10 15:30:41 crc kubenswrapper[4743]: I0310 15:30:41.027576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5869f21f-5fc7-4837-b7e1-688cc16dc3ef","Type":"ContainerStarted","Data":"e2cbce0fcc2c1156f3c20e1f31cb21ade0b69a7085e86bf40f178be890add33c"} Mar 10 15:30:41 crc kubenswrapper[4743]: I0310 15:30:41.031797 4743 generic.go:334] "Generic (PLEG): container finished" podID="b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" containerID="87c551d40a377cf99116a7b432d9abad33977a05937958fecd64640cf7c067e2" exitCode=0 Mar 10 15:30:41 crc kubenswrapper[4743]: I0310 15:30:41.031883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gf5kp" event={"ID":"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3","Type":"ContainerDied","Data":"87c551d40a377cf99116a7b432d9abad33977a05937958fecd64640cf7c067e2"} Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.253036 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.257311 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.624962 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.697629 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-combined-ca-bundle\") pod \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.697726 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-config-data\") pod \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.697790 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-scripts\") pod \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.698097 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsdvf\" (UniqueName: \"kubernetes.io/projected/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-kube-api-access-nsdvf\") pod \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\" (UID: \"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3\") " Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.704638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-scripts" (OuterVolumeSpecName: "scripts") pod "b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" (UID: "b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.733062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-kube-api-access-nsdvf" (OuterVolumeSpecName: "kube-api-access-nsdvf") pod "b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" (UID: "b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3"). InnerVolumeSpecName "kube-api-access-nsdvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.754674 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-config-data" (OuterVolumeSpecName: "config-data") pod "b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" (UID: "b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.763104 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" (UID: "b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.800315 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsdvf\" (UniqueName: \"kubernetes.io/projected/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-kube-api-access-nsdvf\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.800644 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.800660 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:42 crc kubenswrapper[4743]: I0310 15:30:42.800673 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.058188 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5869f21f-5fc7-4837-b7e1-688cc16dc3ef","Type":"ContainerStarted","Data":"b7dd4cfe960338f4f94d71437dcd218f6c48ec4ed0d844e06c8e7f225f61594f"} Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.058274 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.062400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gf5kp" event={"ID":"b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3","Type":"ContainerDied","Data":"da43995a8038a65ed17516fc226e0bd51183b8f11f84ca72af5ce0d2c6bdee27"} Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.062433 4743 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gf5kp" Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.062458 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da43995a8038a65ed17516fc226e0bd51183b8f11f84ca72af5ce0d2c6bdee27" Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.123746 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.290111808 podStartE2EDuration="6.123723605s" podCreationTimestamp="2026-03-10 15:30:37 +0000 UTC" firstStartedPulling="2026-03-10 15:30:38.859484015 +0000 UTC m=+1503.566298753" lastFinishedPulling="2026-03-10 15:30:42.693095792 +0000 UTC m=+1507.399910550" observedRunningTime="2026-03-10 15:30:43.085717947 +0000 UTC m=+1507.792532705" watchObservedRunningTime="2026-03-10 15:30:43.123723605 +0000 UTC m=+1507.830538363" Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.256070 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.262144 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.296803 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.340922 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.341497 
4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="46f01fb8-754a-4367-9a23-e85cdd0e44d5" containerName="nova-scheduler-scheduler" containerID="cri-o://f074304206d1fb638ec82bc4514026d9a9cf8ecc35e20ec48cbb6a855f822f85" gracePeriod=30 Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.352035 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.352299 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-log" containerID="cri-o://6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f" gracePeriod=30 Mar 10 15:30:43 crc kubenswrapper[4743]: I0310 15:30:43.352916 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-metadata" containerID="cri-o://d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a" gracePeriod=30 Mar 10 15:30:44 crc kubenswrapper[4743]: I0310 15:30:44.072870 4743 generic.go:334] "Generic (PLEG): container finished" podID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerID="6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f" exitCode=143 Mar 10 15:30:44 crc kubenswrapper[4743]: I0310 15:30:44.072963 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e33770af-c94d-4a80-97d0-412c0f3605d6","Type":"ContainerDied","Data":"6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f"} Mar 10 15:30:44 crc kubenswrapper[4743]: I0310 15:30:44.073394 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-log" 
containerID="cri-o://a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb" gracePeriod=30 Mar 10 15:30:44 crc kubenswrapper[4743]: I0310 15:30:44.073489 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-api" containerID="cri-o://faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709" gracePeriod=30 Mar 10 15:30:45 crc kubenswrapper[4743]: I0310 15:30:45.084140 4743 generic.go:334] "Generic (PLEG): container finished" podID="fa232077-e977-4b86-94fa-026a7630c4f2" containerID="a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb" exitCode=143 Mar 10 15:30:45 crc kubenswrapper[4743]: I0310 15:30:45.084327 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa232077-e977-4b86-94fa-026a7630c4f2","Type":"ContainerDied","Data":"a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb"} Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.105311 4743 generic.go:334] "Generic (PLEG): container finished" podID="46f01fb8-754a-4367-9a23-e85cdd0e44d5" containerID="f074304206d1fb638ec82bc4514026d9a9cf8ecc35e20ec48cbb6a855f822f85" exitCode=0 Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.105404 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46f01fb8-754a-4367-9a23-e85cdd0e44d5","Type":"ContainerDied","Data":"f074304206d1fb638ec82bc4514026d9a9cf8ecc35e20ec48cbb6a855f822f85"} Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.441684 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.582569 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-config-data\") pod \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.582938 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-combined-ca-bundle\") pod \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.583004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5glkv\" (UniqueName: \"kubernetes.io/projected/46f01fb8-754a-4367-9a23-e85cdd0e44d5-kube-api-access-5glkv\") pod \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\" (UID: \"46f01fb8-754a-4367-9a23-e85cdd0e44d5\") " Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.603710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f01fb8-754a-4367-9a23-e85cdd0e44d5-kube-api-access-5glkv" (OuterVolumeSpecName: "kube-api-access-5glkv") pod "46f01fb8-754a-4367-9a23-e85cdd0e44d5" (UID: "46f01fb8-754a-4367-9a23-e85cdd0e44d5"). InnerVolumeSpecName "kube-api-access-5glkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.629604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46f01fb8-754a-4367-9a23-e85cdd0e44d5" (UID: "46f01fb8-754a-4367-9a23-e85cdd0e44d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.646523 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-config-data" (OuterVolumeSpecName: "config-data") pod "46f01fb8-754a-4367-9a23-e85cdd0e44d5" (UID: "46f01fb8-754a-4367-9a23-e85cdd0e44d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.685955 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.685994 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5glkv\" (UniqueName: \"kubernetes.io/projected/46f01fb8-754a-4367-9a23-e85cdd0e44d5-kube-api-access-5glkv\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.686009 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f01fb8-754a-4367-9a23-e85cdd0e44d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.881527 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.994074 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-combined-ca-bundle\") pod \"e33770af-c94d-4a80-97d0-412c0f3605d6\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.994206 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4wxw\" (UniqueName: \"kubernetes.io/projected/e33770af-c94d-4a80-97d0-412c0f3605d6-kube-api-access-g4wxw\") pod \"e33770af-c94d-4a80-97d0-412c0f3605d6\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.994262 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e33770af-c94d-4a80-97d0-412c0f3605d6-logs\") pod \"e33770af-c94d-4a80-97d0-412c0f3605d6\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.994286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-nova-metadata-tls-certs\") pod \"e33770af-c94d-4a80-97d0-412c0f3605d6\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.994369 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-config-data\") pod \"e33770af-c94d-4a80-97d0-412c0f3605d6\" (UID: \"e33770af-c94d-4a80-97d0-412c0f3605d6\") " Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.995151 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e33770af-c94d-4a80-97d0-412c0f3605d6-logs" (OuterVolumeSpecName: "logs") pod "e33770af-c94d-4a80-97d0-412c0f3605d6" (UID: "e33770af-c94d-4a80-97d0-412c0f3605d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:46 crc kubenswrapper[4743]: I0310 15:30:46.998238 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e33770af-c94d-4a80-97d0-412c0f3605d6-kube-api-access-g4wxw" (OuterVolumeSpecName: "kube-api-access-g4wxw") pod "e33770af-c94d-4a80-97d0-412c0f3605d6" (UID: "e33770af-c94d-4a80-97d0-412c0f3605d6"). InnerVolumeSpecName "kube-api-access-g4wxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.048781 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e33770af-c94d-4a80-97d0-412c0f3605d6" (UID: "e33770af-c94d-4a80-97d0-412c0f3605d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.051143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-config-data" (OuterVolumeSpecName: "config-data") pod "e33770af-c94d-4a80-97d0-412c0f3605d6" (UID: "e33770af-c94d-4a80-97d0-412c0f3605d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.069800 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e33770af-c94d-4a80-97d0-412c0f3605d6" (UID: "e33770af-c94d-4a80-97d0-412c0f3605d6"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.102178 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.102218 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4wxw\" (UniqueName: \"kubernetes.io/projected/e33770af-c94d-4a80-97d0-412c0f3605d6-kube-api-access-g4wxw\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.102235 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e33770af-c94d-4a80-97d0-412c0f3605d6-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.102248 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.102468 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33770af-c94d-4a80-97d0-412c0f3605d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.120746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46f01fb8-754a-4367-9a23-e85cdd0e44d5","Type":"ContainerDied","Data":"53aacd3a40a993386cda42690f6acb05d435291bbbc6022c46dd2d5045447eb2"} Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.120770 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.120797 4743 scope.go:117] "RemoveContainer" containerID="f074304206d1fb638ec82bc4514026d9a9cf8ecc35e20ec48cbb6a855f822f85" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.123215 4743 generic.go:334] "Generic (PLEG): container finished" podID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerID="d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a" exitCode=0 Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.123247 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e33770af-c94d-4a80-97d0-412c0f3605d6","Type":"ContainerDied","Data":"d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a"} Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.123270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e33770af-c94d-4a80-97d0-412c0f3605d6","Type":"ContainerDied","Data":"bb78405fc8a436438074c37c1775cdf75a66f9ec1c311298fb037cfb83579329"} Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.123328 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.174443 4743 scope.go:117] "RemoveContainer" containerID="d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.188607 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.213347 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.222211 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.232386 4743 scope.go:117] "RemoveContainer" containerID="6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.233772 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.243379 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:47 crc kubenswrapper[4743]: E0310 15:30:47.243944 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" containerName="nova-manage" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.243968 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" containerName="nova-manage" Mar 10 15:30:47 crc kubenswrapper[4743]: E0310 15:30:47.244002 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-log" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.244012 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-log" Mar 10 15:30:47 crc 
kubenswrapper[4743]: E0310 15:30:47.244025 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f01fb8-754a-4367-9a23-e85cdd0e44d5" containerName="nova-scheduler-scheduler" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.244033 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f01fb8-754a-4367-9a23-e85cdd0e44d5" containerName="nova-scheduler-scheduler" Mar 10 15:30:47 crc kubenswrapper[4743]: E0310 15:30:47.244075 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-metadata" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.244084 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-metadata" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.244319 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-log" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.244349 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f01fb8-754a-4367-9a23-e85cdd0e44d5" containerName="nova-scheduler-scheduler" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.244371 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" containerName="nova-metadata-metadata" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.244383 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" containerName="nova-manage" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.245260 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.247414 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.259176 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.267937 4743 scope.go:117] "RemoveContainer" containerID="d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a" Mar 10 15:30:47 crc kubenswrapper[4743]: E0310 15:30:47.269282 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a\": container with ID starting with d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a not found: ID does not exist" containerID="d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.269381 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a"} err="failed to get container status \"d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a\": rpc error: code = NotFound desc = could not find container \"d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a\": container with ID starting with d30373020fc60d1fec3874d2adac3e6e8e41964bb5595455d2534d856118ac9a not found: ID does not exist" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.269465 4743 scope.go:117] "RemoveContainer" containerID="6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f" Mar 10 15:30:47 crc kubenswrapper[4743]: E0310 15:30:47.269955 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f\": container with ID starting with 6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f not found: ID does not exist" containerID="6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.269987 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f"} err="failed to get container status \"6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f\": rpc error: code = NotFound desc = could not find container \"6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f\": container with ID starting with 6d8c6499e38caaad7c2d16be4296afb4f3ad3344194487b142183f7f216fc62f not found: ID does not exist" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.272155 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.274165 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.276957 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.277010 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.281799 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.408036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.408438 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvsk\" (UniqueName: \"kubernetes.io/projected/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-kube-api-access-8rvsk\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.408487 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m7b2\" (UniqueName: \"kubernetes.io/projected/5d521c59-429a-4612-b6f4-fbc32204a748-kube-api-access-2m7b2\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.408538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-logs\") pod 
\"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.408593 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-config-data\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.408617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.408636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d521c59-429a-4612-b6f4-fbc32204a748-config-data\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.408656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d521c59-429a-4612-b6f4-fbc32204a748-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.510620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " 
pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.511051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvsk\" (UniqueName: \"kubernetes.io/projected/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-kube-api-access-8rvsk\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.511264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m7b2\" (UniqueName: \"kubernetes.io/projected/5d521c59-429a-4612-b6f4-fbc32204a748-kube-api-access-2m7b2\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.511511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-logs\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.511702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-config-data\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.511906 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.512042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d521c59-429a-4612-b6f4-fbc32204a748-config-data\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.512050 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-logs\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.512211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d521c59-429a-4612-b6f4-fbc32204a748-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.516465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d521c59-429a-4612-b6f4-fbc32204a748-config-data\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.516490 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.517299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-config-data\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 
15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.517656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.524415 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d521c59-429a-4612-b6f4-fbc32204a748-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.527843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m7b2\" (UniqueName: \"kubernetes.io/projected/5d521c59-429a-4612-b6f4-fbc32204a748-kube-api-access-2m7b2\") pod \"nova-scheduler-0\" (UID: \"5d521c59-429a-4612-b6f4-fbc32204a748\") " pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.544207 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvsk\" (UniqueName: \"kubernetes.io/projected/608e6dfb-b1d2-4f61-93c1-28aa07052a5f-kube-api-access-8rvsk\") pod \"nova-metadata-0\" (UID: \"608e6dfb-b1d2-4f61-93c1-28aa07052a5f\") " pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.560926 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.590672 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.928684 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f01fb8-754a-4367-9a23-e85cdd0e44d5" path="/var/lib/kubelet/pods/46f01fb8-754a-4367-9a23-e85cdd0e44d5/volumes" Mar 10 15:30:47 crc kubenswrapper[4743]: I0310 15:30:47.929690 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e33770af-c94d-4a80-97d0-412c0f3605d6" path="/var/lib/kubelet/pods/e33770af-c94d-4a80-97d0-412c0f3605d6/volumes" Mar 10 15:30:48 crc kubenswrapper[4743]: I0310 15:30:48.068282 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:30:48 crc kubenswrapper[4743]: I0310 15:30:48.137724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5d521c59-429a-4612-b6f4-fbc32204a748","Type":"ContainerStarted","Data":"c2ac17f2f9723decf95ee7b59206fe2aff085ac4c76f0a79260a90fbf570dc56"} Mar 10 15:30:48 crc kubenswrapper[4743]: W0310 15:30:48.144771 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod608e6dfb_b1d2_4f61_93c1_28aa07052a5f.slice/crio-048a230e28ab6f93386226fc42c57185b07e0dc4492f2c7d5917593b5093968d WatchSource:0}: Error finding container 048a230e28ab6f93386226fc42c57185b07e0dc4492f2c7d5917593b5093968d: Status 404 returned error can't find the container with id 048a230e28ab6f93386226fc42c57185b07e0dc4492f2c7d5917593b5093968d Mar 10 15:30:48 crc kubenswrapper[4743]: I0310 15:30:48.146022 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:30:48 crc kubenswrapper[4743]: I0310 15:30:48.862225 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.048206 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-public-tls-certs\") pod \"fa232077-e977-4b86-94fa-026a7630c4f2\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.048620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-combined-ca-bundle\") pod \"fa232077-e977-4b86-94fa-026a7630c4f2\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.048828 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-config-data\") pod \"fa232077-e977-4b86-94fa-026a7630c4f2\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.048898 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa232077-e977-4b86-94fa-026a7630c4f2-logs\") pod \"fa232077-e977-4b86-94fa-026a7630c4f2\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.049000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-internal-tls-certs\") pod \"fa232077-e977-4b86-94fa-026a7630c4f2\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.049068 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klkmk\" (UniqueName: 
\"kubernetes.io/projected/fa232077-e977-4b86-94fa-026a7630c4f2-kube-api-access-klkmk\") pod \"fa232077-e977-4b86-94fa-026a7630c4f2\" (UID: \"fa232077-e977-4b86-94fa-026a7630c4f2\") " Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.049688 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa232077-e977-4b86-94fa-026a7630c4f2-logs" (OuterVolumeSpecName: "logs") pod "fa232077-e977-4b86-94fa-026a7630c4f2" (UID: "fa232077-e977-4b86-94fa-026a7630c4f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.050242 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa232077-e977-4b86-94fa-026a7630c4f2-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.064375 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa232077-e977-4b86-94fa-026a7630c4f2-kube-api-access-klkmk" (OuterVolumeSpecName: "kube-api-access-klkmk") pod "fa232077-e977-4b86-94fa-026a7630c4f2" (UID: "fa232077-e977-4b86-94fa-026a7630c4f2"). InnerVolumeSpecName "kube-api-access-klkmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.076410 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-config-data" (OuterVolumeSpecName: "config-data") pod "fa232077-e977-4b86-94fa-026a7630c4f2" (UID: "fa232077-e977-4b86-94fa-026a7630c4f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.078898 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa232077-e977-4b86-94fa-026a7630c4f2" (UID: "fa232077-e977-4b86-94fa-026a7630c4f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.102638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fa232077-e977-4b86-94fa-026a7630c4f2" (UID: "fa232077-e977-4b86-94fa-026a7630c4f2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.104712 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fa232077-e977-4b86-94fa-026a7630c4f2" (UID: "fa232077-e977-4b86-94fa-026a7630c4f2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.151882 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.151921 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.151937 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klkmk\" (UniqueName: \"kubernetes.io/projected/fa232077-e977-4b86-94fa-026a7630c4f2-kube-api-access-klkmk\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.151953 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.151968 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa232077-e977-4b86-94fa-026a7630c4f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.157783 4743 generic.go:334] "Generic (PLEG): container finished" podID="fa232077-e977-4b86-94fa-026a7630c4f2" containerID="faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709" exitCode=0 Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.157874 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa232077-e977-4b86-94fa-026a7630c4f2","Type":"ContainerDied","Data":"faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709"} Mar 10 15:30:49 crc kubenswrapper[4743]: 
I0310 15:30:49.157896 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.157940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa232077-e977-4b86-94fa-026a7630c4f2","Type":"ContainerDied","Data":"2e118909daddfd015532110a3b7596fd82d71c4751b38e57c23781336455acfc"} Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.157967 4743 scope.go:117] "RemoveContainer" containerID="faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.163938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"608e6dfb-b1d2-4f61-93c1-28aa07052a5f","Type":"ContainerStarted","Data":"ab8a728ccaecdf5b971d1ca5db1d3b00483982e4397e5e9443e701b419521676"} Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.164018 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"608e6dfb-b1d2-4f61-93c1-28aa07052a5f","Type":"ContainerStarted","Data":"0f19b26f90f8a62c43b5c4dedbd3f45c46287378ad1fa53336e212d561fda399"} Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.164057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"608e6dfb-b1d2-4f61-93c1-28aa07052a5f","Type":"ContainerStarted","Data":"048a230e28ab6f93386226fc42c57185b07e0dc4492f2c7d5917593b5093968d"} Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.166737 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5d521c59-429a-4612-b6f4-fbc32204a748","Type":"ContainerStarted","Data":"761fbe56c2f77171beae20bc983d120625cf22609865751773db0e1b7f20bf92"} Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.196217 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.196192106 podStartE2EDuration="2.196192106s" podCreationTimestamp="2026-03-10 15:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:49.188425654 +0000 UTC m=+1513.895240402" watchObservedRunningTime="2026-03-10 15:30:49.196192106 +0000 UTC m=+1513.903006854" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.206393 4743 scope.go:117] "RemoveContainer" containerID="a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.219717 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.230436 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.233404 4743 scope.go:117] "RemoveContainer" containerID="faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709" Mar 10 15:30:49 crc kubenswrapper[4743]: E0310 15:30:49.234061 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709\": container with ID starting with faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709 not found: ID does not exist" containerID="faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.234119 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709"} err="failed to get container status \"faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709\": rpc error: code = NotFound desc = could not find container \"faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709\": container with ID starting 
with faf9ef5c8f1fd8e4e4a8c6c7777d5d9476d0b2c994a65ffe1dcf054f65871709 not found: ID does not exist" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.234154 4743 scope.go:117] "RemoveContainer" containerID="a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb" Mar 10 15:30:49 crc kubenswrapper[4743]: E0310 15:30:49.234888 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb\": container with ID starting with a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb not found: ID does not exist" containerID="a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.234921 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb"} err="failed to get container status \"a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb\": rpc error: code = NotFound desc = could not find container \"a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb\": container with ID starting with a32c96b8ddc7214b97586ae32f9ef160d60fb6b943c9cd4127c410e6735aa9eb not found: ID does not exist" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.240872 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:49 crc kubenswrapper[4743]: E0310 15:30:49.241363 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-log" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.241378 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-log" Mar 10 15:30:49 crc kubenswrapper[4743]: E0310 15:30:49.241388 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-api" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.241394 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-api" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.241559 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-api" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.241580 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" containerName="nova-api-log" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.242749 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.247985 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.248359 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.249215 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.258577 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.258554921 podStartE2EDuration="2.258554921s" podCreationTimestamp="2026-03-10 15:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:49.224107695 +0000 UTC m=+1513.930922443" watchObservedRunningTime="2026-03-10 15:30:49.258554921 +0000 UTC m=+1513.965369669" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.275734 4743 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.355028 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgd6s\" (UniqueName: \"kubernetes.io/projected/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-kube-api-access-qgd6s\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.355097 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-internal-tls-certs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.355178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-logs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.355220 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.355290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-public-tls-certs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.355326 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-config-data\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.457842 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-logs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.458024 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.458230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-public-tls-certs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.458262 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-logs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.458391 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-config-data\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " 
pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.458475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgd6s\" (UniqueName: \"kubernetes.io/projected/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-kube-api-access-qgd6s\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.458595 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-internal-tls-certs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.462750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-internal-tls-certs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.463093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-public-tls-certs\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.472747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.475536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-config-data\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.475850 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgd6s\" (UniqueName: \"kubernetes.io/projected/476447c9-b5d8-4f7d-a6a7-f1bff8302ced-kube-api-access-qgd6s\") pod \"nova-api-0\" (UID: \"476447c9-b5d8-4f7d-a6a7-f1bff8302ced\") " pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.562545 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:30:49 crc kubenswrapper[4743]: I0310 15:30:49.929200 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa232077-e977-4b86-94fa-026a7630c4f2" path="/var/lib/kubelet/pods/fa232077-e977-4b86-94fa-026a7630c4f2/volumes" Mar 10 15:30:50 crc kubenswrapper[4743]: I0310 15:30:50.066669 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:30:50 crc kubenswrapper[4743]: I0310 15:30:50.181320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"476447c9-b5d8-4f7d-a6a7-f1bff8302ced","Type":"ContainerStarted","Data":"8aa04a2326a3cb3ed77b66727e52236834eed93d57521e73afc89f5c5c70e6a7"} Mar 10 15:30:51 crc kubenswrapper[4743]: I0310 15:30:51.199110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"476447c9-b5d8-4f7d-a6a7-f1bff8302ced","Type":"ContainerStarted","Data":"23ea17a799c69f802234ed452e5ead98e0ceb07c75b10a1ef0f66df692803398"} Mar 10 15:30:51 crc kubenswrapper[4743]: I0310 15:30:51.199847 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"476447c9-b5d8-4f7d-a6a7-f1bff8302ced","Type":"ContainerStarted","Data":"52443884cb7c99ef14bd4a65952846bfc0da84590d74ab1bb6117f231d14add0"} Mar 10 15:30:51 
crc kubenswrapper[4743]: I0310 15:30:51.229983 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.229964433 podStartE2EDuration="2.229964433s" podCreationTimestamp="2026-03-10 15:30:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:51.226675279 +0000 UTC m=+1515.933490027" watchObservedRunningTime="2026-03-10 15:30:51.229964433 +0000 UTC m=+1515.936779181" Mar 10 15:30:51 crc kubenswrapper[4743]: I0310 15:30:51.726177 4743 scope.go:117] "RemoveContainer" containerID="bc6875416e2bf56649ef481dec269bb9679823c362f822b0f47a0d00adee58a8" Mar 10 15:30:52 crc kubenswrapper[4743]: I0310 15:30:52.562406 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 15:30:52 crc kubenswrapper[4743]: I0310 15:30:52.591791 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:30:52 crc kubenswrapper[4743]: I0310 15:30:52.591865 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:30:57 crc kubenswrapper[4743]: I0310 15:30:57.562246 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 15:30:57 crc kubenswrapper[4743]: I0310 15:30:57.590737 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 15:30:57 crc kubenswrapper[4743]: I0310 15:30:57.591169 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 15:30:57 crc kubenswrapper[4743]: I0310 15:30:57.591226 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 15:30:58 crc kubenswrapper[4743]: I0310 15:30:58.326811 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 15:30:58 crc kubenswrapper[4743]: I0310 15:30:58.608115 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="608e6dfb-b1d2-4f61-93c1-28aa07052a5f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.235:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:30:58 crc kubenswrapper[4743]: I0310 15:30:58.608126 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="608e6dfb-b1d2-4f61-93c1-28aa07052a5f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.235:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:30:59 crc kubenswrapper[4743]: I0310 15:30:59.563011 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:30:59 crc kubenswrapper[4743]: I0310 15:30:59.563100 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:31:00 crc kubenswrapper[4743]: I0310 15:31:00.575060 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="476447c9-b5d8-4f7d-a6a7-f1bff8302ced" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.236:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:31:00 crc kubenswrapper[4743]: I0310 15:31:00.575124 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="476447c9-b5d8-4f7d-a6a7-f1bff8302ced" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.236:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:31:07 crc kubenswrapper[4743]: I0310 15:31:07.603313 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Mar 10 15:31:07 crc kubenswrapper[4743]: I0310 15:31:07.604176 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 15:31:07 crc kubenswrapper[4743]: I0310 15:31:07.613697 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 15:31:07 crc kubenswrapper[4743]: I0310 15:31:07.616014 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 15:31:08 crc kubenswrapper[4743]: I0310 15:31:08.394260 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 15:31:09 crc kubenswrapper[4743]: I0310 15:31:09.568200 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 15:31:09 crc kubenswrapper[4743]: I0310 15:31:09.568738 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 15:31:09 crc kubenswrapper[4743]: I0310 15:31:09.573414 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 15:31:09 crc kubenswrapper[4743]: I0310 15:31:09.574775 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 15:31:10 crc kubenswrapper[4743]: I0310 15:31:10.450247 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 15:31:10 crc kubenswrapper[4743]: I0310 15:31:10.457577 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 15:31:18 crc kubenswrapper[4743]: I0310 15:31:18.005008 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:31:18 crc kubenswrapper[4743]: I0310 15:31:18.815922 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:31:22 crc kubenswrapper[4743]: I0310 15:31:22.419361 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerName="rabbitmq" containerID="cri-o://07e58d41414928253bfac6c3afe501ba6c33a78dd94b06a2b744a115e761b46c" gracePeriod=604796 Mar 10 15:31:23 crc kubenswrapper[4743]: I0310 15:31:23.463019 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerName="rabbitmq" containerID="cri-o://039279fbc32f9a50c4bfda5f3da5c7d4eb8a1582a58c193b5befb912702ef757" gracePeriod=604796 Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.692924 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v47pg"] Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.695914 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.706619 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v47pg"] Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.769925 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-utilities\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.770354 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92x57\" (UniqueName: \"kubernetes.io/projected/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-kube-api-access-92x57\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.770591 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-catalog-content\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.872516 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-catalog-content\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.872705 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-utilities\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.872843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92x57\" (UniqueName: \"kubernetes.io/projected/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-kube-api-access-92x57\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.873035 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-catalog-content\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.873436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-utilities\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.902263 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92x57\" (UniqueName: \"kubernetes.io/projected/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-kube-api-access-92x57\") pod \"redhat-marketplace-v47pg\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:24 crc kubenswrapper[4743]: I0310 15:31:24.983768 4743 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/rabbitmq-cell1-server-0" podUID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 10 15:31:25 crc kubenswrapper[4743]: I0310 15:31:25.020734 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:25 crc kubenswrapper[4743]: I0310 15:31:25.441730 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 10 15:31:25 crc kubenswrapper[4743]: I0310 15:31:25.560732 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v47pg"] Mar 10 15:31:25 crc kubenswrapper[4743]: I0310 15:31:25.628529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v47pg" event={"ID":"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974","Type":"ContainerStarted","Data":"580591dc0b9750db702fcc90cb806b8b5a279bd0b84272374a1cddf616f25b5c"} Mar 10 15:31:26 crc kubenswrapper[4743]: I0310 15:31:26.643328 4743 generic.go:334] "Generic (PLEG): container finished" podID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerID="58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7" exitCode=0 Mar 10 15:31:26 crc kubenswrapper[4743]: I0310 15:31:26.643567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v47pg" event={"ID":"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974","Type":"ContainerDied","Data":"58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7"} Mar 10 15:31:26 crc kubenswrapper[4743]: I0310 15:31:26.648084 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:31:27 crc kubenswrapper[4743]: I0310 15:31:27.658613 
4743 generic.go:334] "Generic (PLEG): container finished" podID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerID="ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5" exitCode=0 Mar 10 15:31:27 crc kubenswrapper[4743]: I0310 15:31:27.658665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v47pg" event={"ID":"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974","Type":"ContainerDied","Data":"ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5"} Mar 10 15:31:28 crc kubenswrapper[4743]: I0310 15:31:28.676380 4743 generic.go:334] "Generic (PLEG): container finished" podID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerID="07e58d41414928253bfac6c3afe501ba6c33a78dd94b06a2b744a115e761b46c" exitCode=0 Mar 10 15:31:28 crc kubenswrapper[4743]: I0310 15:31:28.676707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7541c4b7-eda5-4cd5-b0c4-c00621726c2b","Type":"ContainerDied","Data":"07e58d41414928253bfac6c3afe501ba6c33a78dd94b06a2b744a115e761b46c"} Mar 10 15:31:28 crc kubenswrapper[4743]: I0310 15:31:28.683191 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v47pg" event={"ID":"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974","Type":"ContainerStarted","Data":"f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e"} Mar 10 15:31:28 crc kubenswrapper[4743]: I0310 15:31:28.712614 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v47pg" podStartSLOduration=3.200267973 podStartE2EDuration="4.712591407s" podCreationTimestamp="2026-03-10 15:31:24 +0000 UTC" firstStartedPulling="2026-03-10 15:31:26.647674301 +0000 UTC m=+1551.354489069" lastFinishedPulling="2026-03-10 15:31:28.159997755 +0000 UTC m=+1552.866812503" observedRunningTime="2026-03-10 15:31:28.704966369 +0000 UTC m=+1553.411781127" watchObservedRunningTime="2026-03-10 15:31:28.712591407 
+0000 UTC m=+1553.419406165" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.096573 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.175902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-confd\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.175988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-pod-info\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.176067 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-tls\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.176141 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-erlang-cookie-secret\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.176198 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-plugins-conf\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc 
kubenswrapper[4743]: I0310 15:31:29.176238 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-server-conf\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.176277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvxh2\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-kube-api-access-jvxh2\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.176382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-erlang-cookie\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.176406 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-config-data\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.176431 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.176509 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-plugins\") 
pod \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\" (UID: \"7541c4b7-eda5-4cd5-b0c4-c00621726c2b\") " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.181179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.183595 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.188314 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.200031 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.202126 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.208384 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-kube-api-access-jvxh2" (OuterVolumeSpecName: "kube-api-access-jvxh2") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "kube-api-access-jvxh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.211229 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-pod-info" (OuterVolumeSpecName: "pod-info") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.211346 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.249123 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-config-data" (OuterVolumeSpecName: "config-data") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279391 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279421 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279429 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279437 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279446 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279458 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvxh2\" (UniqueName: 
\"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-kube-api-access-jvxh2\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279468 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279477 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.279498 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.303352 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-server-conf" (OuterVolumeSpecName: "server-conf") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.342849 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.358025 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7541c4b7-eda5-4cd5-b0c4-c00621726c2b" (UID: "7541c4b7-eda5-4cd5-b0c4-c00621726c2b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.383286 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.384013 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.384042 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7541c4b7-eda5-4cd5-b0c4-c00621726c2b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.696967 4743 generic.go:334] "Generic (PLEG): container finished" podID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerID="039279fbc32f9a50c4bfda5f3da5c7d4eb8a1582a58c193b5befb912702ef757" exitCode=0 Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.697059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"884d5267-4e85-481c-96f0-eb31b88bfe67","Type":"ContainerDied","Data":"039279fbc32f9a50c4bfda5f3da5c7d4eb8a1582a58c193b5befb912702ef757"} Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.701548 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.702342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7541c4b7-eda5-4cd5-b0c4-c00621726c2b","Type":"ContainerDied","Data":"328f1248558b679ac72e90fe9427ed4f1003f7a8cbed165974c8d9973c5130e8"} Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.702437 4743 scope.go:117] "RemoveContainer" containerID="07e58d41414928253bfac6c3afe501ba6c33a78dd94b06a2b744a115e761b46c" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.778155 4743 scope.go:117] "RemoveContainer" containerID="290b8969232645596b41011b8e17a74e528a28bee07b274203a8b937c2e4b355" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.795772 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.805323 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.833828 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:31:29 crc kubenswrapper[4743]: E0310 15:31:29.834355 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerName="rabbitmq" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.834371 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerName="rabbitmq" Mar 10 15:31:29 crc kubenswrapper[4743]: E0310 15:31:29.834391 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerName="setup-container" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.834401 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerName="setup-container" Mar 10 15:31:29 crc kubenswrapper[4743]: 
I0310 15:31:29.834631 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" containerName="rabbitmq" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.835747 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.842486 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.842757 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.843500 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.844410 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.844619 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.844946 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.849626 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-blfbw" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.867326 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.895721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.895765 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.895895 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.895943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.895966 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-config-data\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.895983 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 
10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.896001 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e44585ef-8ab2-45e9-a4f3-f333629f433a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.896019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e44585ef-8ab2-45e9-a4f3-f333629f433a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.896090 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.896120 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.896144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvjpn\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-kube-api-access-lvjpn\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.930937 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7541c4b7-eda5-4cd5-b0c4-c00621726c2b" path="/var/lib/kubelet/pods/7541c4b7-eda5-4cd5-b0c4-c00621726c2b/volumes" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998290 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvjpn\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-kube-api-access-lvjpn\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998419 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" 
Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998514 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998625 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-config-data\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998645 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e44585ef-8ab2-45e9-a4f3-f333629f433a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.998717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e44585ef-8ab2-45e9-a4f3-f333629f433a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:29 crc kubenswrapper[4743]: I0310 15:31:29.999368 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.001633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-config-data\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.001667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.002461 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.002522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e44585ef-8ab2-45e9-a4f3-f333629f433a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " 
pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.006166 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.007978 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.009405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.016560 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e44585ef-8ab2-45e9-a4f3-f333629f433a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.023800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvjpn\" (UniqueName: \"kubernetes.io/projected/e44585ef-8ab2-45e9-a4f3-f333629f433a-kube-api-access-lvjpn\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.033649 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e44585ef-8ab2-45e9-a4f3-f333629f433a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.072850 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e44585ef-8ab2-45e9-a4f3-f333629f433a\") " pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.152386 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.211517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-config-data\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.211790 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-erlang-cookie\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.211824 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/884d5267-4e85-481c-96f0-eb31b88bfe67-pod-info\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.211873 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-plugins\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.211904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-plugins-conf\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.211944 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvdg2\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-kube-api-access-kvdg2\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.211992 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-tls\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.212093 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-server-conf\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.212186 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/884d5267-4e85-481c-96f0-eb31b88bfe67-erlang-cookie-secret\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: 
I0310 15:31:30.212984 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.213088 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-confd\") pod \"884d5267-4e85-481c-96f0-eb31b88bfe67\" (UID: \"884d5267-4e85-481c-96f0-eb31b88bfe67\") " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.213988 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.215835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.219086 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-kube-api-access-kvdg2" (OuterVolumeSpecName: "kube-api-access-kvdg2") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "kube-api-access-kvdg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.219283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.221459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.223449 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.224193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.233019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/884d5267-4e85-481c-96f0-eb31b88bfe67-pod-info" (OuterVolumeSpecName: "pod-info") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.243300 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884d5267-4e85-481c-96f0-eb31b88bfe67-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.257104 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-config-data" (OuterVolumeSpecName: "config-data") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316432 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/884d5267-4e85-481c-96f0-eb31b88bfe67-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316490 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316504 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316519 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-erlang-cookie\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316534 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/884d5267-4e85-481c-96f0-eb31b88bfe67-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316547 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316559 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316570 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvdg2\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-kube-api-access-kvdg2\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.316584 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.346333 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-server-conf" (OuterVolumeSpecName: "server-conf") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.346698 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.401767 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "884d5267-4e85-481c-96f0-eb31b88bfe67" (UID: "884d5267-4e85-481c-96f0-eb31b88bfe67"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.418218 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/884d5267-4e85-481c-96f0-eb31b88bfe67-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.418251 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.418261 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/884d5267-4e85-481c-96f0-eb31b88bfe67-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.627727 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.730915 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"884d5267-4e85-481c-96f0-eb31b88bfe67","Type":"ContainerDied","Data":"68c784d8b3a23ea6a169548af32af6550eaa28f21ef52396dc1225b0fde88700"} Mar 10 15:31:30 crc 
kubenswrapper[4743]: I0310 15:31:30.730971 4743 scope.go:117] "RemoveContainer" containerID="039279fbc32f9a50c4bfda5f3da5c7d4eb8a1582a58c193b5befb912702ef757" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.731082 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.750004 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44585ef-8ab2-45e9-a4f3-f333629f433a","Type":"ContainerStarted","Data":"2b83ed0fcb7be0cdb2b788a23d7542ce4c997825b213a5594467ea899bba9dd9"} Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.810129 4743 scope.go:117] "RemoveContainer" containerID="642a6c5de3968f103baa5bbf0937b99b8102f35959c7bfcd09d2ddb3d9449e34" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.864720 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.878023 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.903457 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:31:30 crc kubenswrapper[4743]: E0310 15:31:30.904012 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerName="setup-container" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.904038 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerName="setup-container" Mar 10 15:31:30 crc kubenswrapper[4743]: E0310 15:31:30.904097 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerName="rabbitmq" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.904108 4743 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerName="rabbitmq" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.904384 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="884d5267-4e85-481c-96f0-eb31b88bfe67" containerName="rabbitmq" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.905741 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.909091 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.909419 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-bbrgl" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.909607 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.909728 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.909879 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.909991 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.910512 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 15:31:30 crc kubenswrapper[4743]: I0310 15:31:30.961736 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.015491 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-696df444c7-rmbjj"] Mar 
10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.017119 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.019962 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.040056 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-rmbjj"] Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.053400 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwdz5\" (UniqueName: \"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-kube-api-access-cwdz5\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.053624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.053672 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.053836 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.053918 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.054000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.054177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.054203 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.054287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b741008c-73ba-4516-bb63-05b066d7051b-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.055251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b741008c-73ba-4516-bb63-05b066d7051b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.055384 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.157759 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-sb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.157839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b741008c-73ba-4516-bb63-05b066d7051b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.157872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.157929 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwdz5\" (UniqueName: \"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-kube-api-access-cwdz5\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.157953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-nb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.157979 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-svc\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158117 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158175 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b4wb\" (UniqueName: \"kubernetes.io/projected/1d461cfd-9fd1-4674-97d2-65c42611c1d1-kube-api-access-4b4wb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158209 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158373 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-openstack-edpm-ipam\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-config\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158799 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 
15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.158844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.159060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-swift-storage-0\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.159124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b741008c-73ba-4516-bb63-05b066d7051b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.159621 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.159692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.159742 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b741008c-73ba-4516-bb63-05b066d7051b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.162135 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b741008c-73ba-4516-bb63-05b066d7051b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.165444 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.170586 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.180160 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b741008c-73ba-4516-bb63-05b066d7051b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.181555 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwdz5\" (UniqueName: 
\"kubernetes.io/projected/b741008c-73ba-4516-bb63-05b066d7051b-kube-api-access-cwdz5\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.198549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b741008c-73ba-4516-bb63-05b066d7051b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.261512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b4wb\" (UniqueName: \"kubernetes.io/projected/1d461cfd-9fd1-4674-97d2-65c42611c1d1-kube-api-access-4b4wb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.261686 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-openstack-edpm-ipam\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.261716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-config\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.261759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-swift-storage-0\") 
pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.261860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-sb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.261953 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-nb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.261993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-svc\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.262925 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-swift-storage-0\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.263425 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-openstack-edpm-ipam\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: 
\"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.264936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-config\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.265075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-svc\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.265266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-sb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.265697 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-nb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.281354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b4wb\" (UniqueName: \"kubernetes.io/projected/1d461cfd-9fd1-4674-97d2-65c42611c1d1-kube-api-access-4b4wb\") pod \"dnsmasq-dns-696df444c7-rmbjj\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc 
kubenswrapper[4743]: I0310 15:31:31.295710 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.345204 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.779380 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.842741 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-rmbjj"] Mar 10 15:31:31 crc kubenswrapper[4743]: W0310 15:31:31.864866 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb741008c_73ba_4516_bb63_05b066d7051b.slice/crio-612c4ecf3fc344fa4af06fd0e949339dbad07089ab27716e9ca4c34865007e4b WatchSource:0}: Error finding container 612c4ecf3fc344fa4af06fd0e949339dbad07089ab27716e9ca4c34865007e4b: Status 404 returned error can't find the container with id 612c4ecf3fc344fa4af06fd0e949339dbad07089ab27716e9ca4c34865007e4b Mar 10 15:31:31 crc kubenswrapper[4743]: W0310 15:31:31.872016 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d461cfd_9fd1_4674_97d2_65c42611c1d1.slice/crio-a592dc35e263fa6beb325eff397df6a575c1ca19bd0f7cb81ac9d96694716714 WatchSource:0}: Error finding container a592dc35e263fa6beb325eff397df6a575c1ca19bd0f7cb81ac9d96694716714: Status 404 returned error can't find the container with id a592dc35e263fa6beb325eff397df6a575c1ca19bd0f7cb81ac9d96694716714 Mar 10 15:31:31 crc kubenswrapper[4743]: I0310 15:31:31.926550 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884d5267-4e85-481c-96f0-eb31b88bfe67" path="/var/lib/kubelet/pods/884d5267-4e85-481c-96f0-eb31b88bfe67/volumes" Mar 
10 15:31:32 crc kubenswrapper[4743]: I0310 15:31:32.774840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44585ef-8ab2-45e9-a4f3-f333629f433a","Type":"ContainerStarted","Data":"632a6bcd5d794e702c3680c538d3d3b6f0b309ba2bbd73f7f956b4754a37f125"} Mar 10 15:31:32 crc kubenswrapper[4743]: I0310 15:31:32.777175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b741008c-73ba-4516-bb63-05b066d7051b","Type":"ContainerStarted","Data":"612c4ecf3fc344fa4af06fd0e949339dbad07089ab27716e9ca4c34865007e4b"} Mar 10 15:31:32 crc kubenswrapper[4743]: I0310 15:31:32.779597 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" containerID="37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a" exitCode=0 Mar 10 15:31:32 crc kubenswrapper[4743]: I0310 15:31:32.779651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" event={"ID":"1d461cfd-9fd1-4674-97d2-65c42611c1d1","Type":"ContainerDied","Data":"37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a"} Mar 10 15:31:32 crc kubenswrapper[4743]: I0310 15:31:32.779683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" event={"ID":"1d461cfd-9fd1-4674-97d2-65c42611c1d1","Type":"ContainerStarted","Data":"a592dc35e263fa6beb325eff397df6a575c1ca19bd0f7cb81ac9d96694716714"} Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.183764 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7zf95"] Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.186715 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.207449 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7zf95"] Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.316726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd7zh\" (UniqueName: \"kubernetes.io/projected/2d1204a0-99da-43e9-8060-00a1e906f044-kube-api-access-hd7zh\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.316869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-utilities\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.317007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-catalog-content\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.418677 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd7zh\" (UniqueName: \"kubernetes.io/projected/2d1204a0-99da-43e9-8060-00a1e906f044-kube-api-access-hd7zh\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.418775 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-utilities\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.418930 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-catalog-content\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.419393 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-catalog-content\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.419639 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-utilities\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.455942 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd7zh\" (UniqueName: \"kubernetes.io/projected/2d1204a0-99da-43e9-8060-00a1e906f044-kube-api-access-hd7zh\") pod \"redhat-operators-7zf95\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.506503 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.803603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b741008c-73ba-4516-bb63-05b066d7051b","Type":"ContainerStarted","Data":"6a1796b823d3d0a1c3653618ece41e830dddf20f307893cb2e75361e6bb719ed"} Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.832062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" event={"ID":"1d461cfd-9fd1-4674-97d2-65c42611c1d1","Type":"ContainerStarted","Data":"ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c"} Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.832219 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:33 crc kubenswrapper[4743]: I0310 15:31:33.931838 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" podStartSLOduration=3.931798314 podStartE2EDuration="3.931798314s" podCreationTimestamp="2026-03-10 15:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:31:33.899640803 +0000 UTC m=+1558.606455551" watchObservedRunningTime="2026-03-10 15:31:33.931798314 +0000 UTC m=+1558.638613062" Mar 10 15:31:34 crc kubenswrapper[4743]: I0310 15:31:34.049332 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7zf95"] Mar 10 15:31:34 crc kubenswrapper[4743]: I0310 15:31:34.844030 4743 generic.go:334] "Generic (PLEG): container finished" podID="2d1204a0-99da-43e9-8060-00a1e906f044" containerID="fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51" exitCode=0 Mar 10 15:31:34 crc kubenswrapper[4743]: I0310 15:31:34.844106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7zf95" event={"ID":"2d1204a0-99da-43e9-8060-00a1e906f044","Type":"ContainerDied","Data":"fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51"} Mar 10 15:31:34 crc kubenswrapper[4743]: I0310 15:31:34.844555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zf95" event={"ID":"2d1204a0-99da-43e9-8060-00a1e906f044","Type":"ContainerStarted","Data":"8308fa50313c034eeb5d12ad32a6afaa5a35d57d89124d960abee3dd665b0bb2"} Mar 10 15:31:35 crc kubenswrapper[4743]: I0310 15:31:35.021250 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:35 crc kubenswrapper[4743]: I0310 15:31:35.021316 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:35 crc kubenswrapper[4743]: I0310 15:31:35.069515 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:35 crc kubenswrapper[4743]: I0310 15:31:35.949297 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:36 crc kubenswrapper[4743]: I0310 15:31:36.869900 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zf95" event={"ID":"2d1204a0-99da-43e9-8060-00a1e906f044","Type":"ContainerStarted","Data":"ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766"} Mar 10 15:31:37 crc kubenswrapper[4743]: I0310 15:31:37.361190 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v47pg"] Mar 10 15:31:37 crc kubenswrapper[4743]: I0310 15:31:37.880874 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v47pg" 
podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerName="registry-server" containerID="cri-o://f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e" gracePeriod=2 Mar 10 15:31:38 crc kubenswrapper[4743]: I0310 15:31:38.891585 4743 generic.go:334] "Generic (PLEG): container finished" podID="2d1204a0-99da-43e9-8060-00a1e906f044" containerID="ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766" exitCode=0 Mar 10 15:31:38 crc kubenswrapper[4743]: I0310 15:31:38.891681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zf95" event={"ID":"2d1204a0-99da-43e9-8060-00a1e906f044","Type":"ContainerDied","Data":"ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766"} Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.445365 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.558417 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-catalog-content\") pod \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.558761 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92x57\" (UniqueName: \"kubernetes.io/projected/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-kube-api-access-92x57\") pod \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\" (UID: \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.558973 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-utilities\") pod \"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\" (UID: 
\"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974\") " Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.559439 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-utilities" (OuterVolumeSpecName: "utilities") pod "3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" (UID: "3b1c8bdd-3209-49e0-a2cf-c5abe1b83974"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.559608 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.565789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-kube-api-access-92x57" (OuterVolumeSpecName: "kube-api-access-92x57") pod "3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" (UID: "3b1c8bdd-3209-49e0-a2cf-c5abe1b83974"). InnerVolumeSpecName "kube-api-access-92x57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.581705 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" (UID: "3b1c8bdd-3209-49e0-a2cf-c5abe1b83974"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.661225 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.661265 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92x57\" (UniqueName: \"kubernetes.io/projected/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974-kube-api-access-92x57\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.908327 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zf95" event={"ID":"2d1204a0-99da-43e9-8060-00a1e906f044","Type":"ContainerStarted","Data":"abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5"} Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.912279 4743 generic.go:334] "Generic (PLEG): container finished" podID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerID="f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e" exitCode=0 Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.912329 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v47pg" event={"ID":"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974","Type":"ContainerDied","Data":"f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e"} Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.912356 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v47pg" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.912371 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v47pg" event={"ID":"3b1c8bdd-3209-49e0-a2cf-c5abe1b83974","Type":"ContainerDied","Data":"580591dc0b9750db702fcc90cb806b8b5a279bd0b84272374a1cddf616f25b5c"} Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.912388 4743 scope.go:117] "RemoveContainer" containerID="f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.941423 4743 scope.go:117] "RemoveContainer" containerID="ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.953203 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7zf95" podStartSLOduration=2.393496169 podStartE2EDuration="6.953183284s" podCreationTimestamp="2026-03-10 15:31:33 +0000 UTC" firstStartedPulling="2026-03-10 15:31:34.846032963 +0000 UTC m=+1559.552847711" lastFinishedPulling="2026-03-10 15:31:39.405720068 +0000 UTC m=+1564.112534826" observedRunningTime="2026-03-10 15:31:39.934278023 +0000 UTC m=+1564.641092811" watchObservedRunningTime="2026-03-10 15:31:39.953183284 +0000 UTC m=+1564.659998032" Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.976735 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v47pg"] Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.986308 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v47pg"] Mar 10 15:31:39 crc kubenswrapper[4743]: I0310 15:31:39.988115 4743 scope.go:117] "RemoveContainer" containerID="58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7" Mar 10 15:31:40 crc kubenswrapper[4743]: I0310 15:31:40.028962 4743 scope.go:117] 
"RemoveContainer" containerID="f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e" Mar 10 15:31:40 crc kubenswrapper[4743]: E0310 15:31:40.029540 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e\": container with ID starting with f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e not found: ID does not exist" containerID="f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e" Mar 10 15:31:40 crc kubenswrapper[4743]: I0310 15:31:40.029573 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e"} err="failed to get container status \"f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e\": rpc error: code = NotFound desc = could not find container \"f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e\": container with ID starting with f7b02d8b3b63a6dcf71bb4305e8e3844f14024c74b81752aea831ac454ede91e not found: ID does not exist" Mar 10 15:31:40 crc kubenswrapper[4743]: I0310 15:31:40.029595 4743 scope.go:117] "RemoveContainer" containerID="ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5" Mar 10 15:31:40 crc kubenswrapper[4743]: E0310 15:31:40.030013 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5\": container with ID starting with ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5 not found: ID does not exist" containerID="ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5" Mar 10 15:31:40 crc kubenswrapper[4743]: I0310 15:31:40.030040 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5"} err="failed to get container status \"ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5\": rpc error: code = NotFound desc = could not find container \"ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5\": container with ID starting with ebb8bf9c9d1deedc5f5a6780c85c5d9035a3f1cb9cca70bf904f7342bbdf89c5 not found: ID does not exist" Mar 10 15:31:40 crc kubenswrapper[4743]: I0310 15:31:40.030055 4743 scope.go:117] "RemoveContainer" containerID="58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7" Mar 10 15:31:40 crc kubenswrapper[4743]: E0310 15:31:40.030327 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7\": container with ID starting with 58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7 not found: ID does not exist" containerID="58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7" Mar 10 15:31:40 crc kubenswrapper[4743]: I0310 15:31:40.030349 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7"} err="failed to get container status \"58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7\": rpc error: code = NotFound desc = could not find container \"58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7\": container with ID starting with 58aebb8041e721514c8dac501e38310408b4358bbd142741131545a306057fb7 not found: ID does not exist" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.252488 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.254371 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.347060 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.431468 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-4297n"] Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.431788 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" podUID="17052016-cb68-4e74-82e3-05531596e17e" containerName="dnsmasq-dns" containerID="cri-o://01d944739ade9c3156cc5e5e277939e2b77016237037ffdde1f2ecc1ceb56bef" gracePeriod=10 Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.655408 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b5f8b59f-qjn6h"] Mar 10 15:31:41 crc kubenswrapper[4743]: E0310 15:31:41.655832 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerName="extract-content" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.655849 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerName="extract-content" Mar 10 15:31:41 crc kubenswrapper[4743]: E0310 15:31:41.655864 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerName="extract-utilities" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.655871 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerName="extract-utilities" Mar 10 15:31:41 crc kubenswrapper[4743]: E0310 15:31:41.655895 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerName="registry-server" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.655901 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerName="registry-server" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.656086 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" containerName="registry-server" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.657090 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.677211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b5f8b59f-qjn6h"] Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.722389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-openstack-edpm-ipam\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.722459 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-ovsdbserver-nb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.722569 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-config\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.722599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-dns-svc\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.722655 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-dns-swift-storage-0\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.722706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-ovsdbserver-sb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.722736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwkb\" (UniqueName: \"kubernetes.io/projected/122f314d-cff5-4699-9d4c-c5221b9174ba-kube-api-access-dvwkb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 
15:31:41.824672 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-config\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.824995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-dns-svc\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.825044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-dns-swift-storage-0\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.825101 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-ovsdbserver-sb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.825124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwkb\" (UniqueName: \"kubernetes.io/projected/122f314d-cff5-4699-9d4c-c5221b9174ba-kube-api-access-dvwkb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.825191 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-openstack-edpm-ipam\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.825220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-ovsdbserver-nb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.826138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-ovsdbserver-nb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.826726 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-ovsdbserver-sb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.827010 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-dns-swift-storage-0\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.827766 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-openstack-edpm-ipam\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.828619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-config\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.828993 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/122f314d-cff5-4699-9d4c-c5221b9174ba-dns-svc\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.849128 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwkb\" (UniqueName: \"kubernetes.io/projected/122f314d-cff5-4699-9d4c-c5221b9174ba-kube-api-access-dvwkb\") pod \"dnsmasq-dns-84b5f8b59f-qjn6h\" (UID: \"122f314d-cff5-4699-9d4c-c5221b9174ba\") " pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.927696 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b1c8bdd-3209-49e0-a2cf-c5abe1b83974" path="/var/lib/kubelet/pods/3b1c8bdd-3209-49e0-a2cf-c5abe1b83974/volumes" Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.934451 4743 generic.go:334] "Generic (PLEG): container finished" podID="17052016-cb68-4e74-82e3-05531596e17e" containerID="01d944739ade9c3156cc5e5e277939e2b77016237037ffdde1f2ecc1ceb56bef" exitCode=0 Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.934496 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" event={"ID":"17052016-cb68-4e74-82e3-05531596e17e","Type":"ContainerDied","Data":"01d944739ade9c3156cc5e5e277939e2b77016237037ffdde1f2ecc1ceb56bef"} Mar 10 15:31:41 crc kubenswrapper[4743]: I0310 15:31:41.990570 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.056892 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.130401 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-config\") pod \"17052016-cb68-4e74-82e3-05531596e17e\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.130764 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-sb\") pod \"17052016-cb68-4e74-82e3-05531596e17e\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.130834 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-nb\") pod \"17052016-cb68-4e74-82e3-05531596e17e\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.130858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-swift-storage-0\") pod \"17052016-cb68-4e74-82e3-05531596e17e\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " Mar 10 
15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.130916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-svc\") pod \"17052016-cb68-4e74-82e3-05531596e17e\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.130936 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qpst\" (UniqueName: \"kubernetes.io/projected/17052016-cb68-4e74-82e3-05531596e17e-kube-api-access-7qpst\") pod \"17052016-cb68-4e74-82e3-05531596e17e\" (UID: \"17052016-cb68-4e74-82e3-05531596e17e\") " Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.148081 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17052016-cb68-4e74-82e3-05531596e17e-kube-api-access-7qpst" (OuterVolumeSpecName: "kube-api-access-7qpst") pod "17052016-cb68-4e74-82e3-05531596e17e" (UID: "17052016-cb68-4e74-82e3-05531596e17e"). InnerVolumeSpecName "kube-api-access-7qpst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.214053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-config" (OuterVolumeSpecName: "config") pod "17052016-cb68-4e74-82e3-05531596e17e" (UID: "17052016-cb68-4e74-82e3-05531596e17e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.223254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "17052016-cb68-4e74-82e3-05531596e17e" (UID: "17052016-cb68-4e74-82e3-05531596e17e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.223483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17052016-cb68-4e74-82e3-05531596e17e" (UID: "17052016-cb68-4e74-82e3-05531596e17e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.234460 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.234517 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.234532 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.234544 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qpst\" (UniqueName: \"kubernetes.io/projected/17052016-cb68-4e74-82e3-05531596e17e-kube-api-access-7qpst\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.237512 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17052016-cb68-4e74-82e3-05531596e17e" (UID: "17052016-cb68-4e74-82e3-05531596e17e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.237935 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17052016-cb68-4e74-82e3-05531596e17e" (UID: "17052016-cb68-4e74-82e3-05531596e17e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.338627 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.338657 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17052016-cb68-4e74-82e3-05531596e17e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.478221 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b5f8b59f-qjn6h"] Mar 10 15:31:42 crc kubenswrapper[4743]: W0310 15:31:42.480484 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod122f314d_cff5_4699_9d4c_c5221b9174ba.slice/crio-36889f829a2160d05f91b385108cdf9a5590e19fef1e04ca2f34aa4b2ad27816 WatchSource:0}: Error finding container 36889f829a2160d05f91b385108cdf9a5590e19fef1e04ca2f34aa4b2ad27816: Status 404 returned error can't find the container with id 36889f829a2160d05f91b385108cdf9a5590e19fef1e04ca2f34aa4b2ad27816 Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.945087 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" 
event={"ID":"17052016-cb68-4e74-82e3-05531596e17e","Type":"ContainerDied","Data":"b569082d7dfcd66dd4c11924ed9ae7395a2af7c83d27be3dc2553186be069683"} Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.945126 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5958d5dc75-4297n" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.945442 4743 scope.go:117] "RemoveContainer" containerID="01d944739ade9c3156cc5e5e277939e2b77016237037ffdde1f2ecc1ceb56bef" Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.947052 4743 generic.go:334] "Generic (PLEG): container finished" podID="122f314d-cff5-4699-9d4c-c5221b9174ba" containerID="92e3ba0094d9e19dc1893f58f0e9c8e81492bdb38226f6e7f3279650128d95ae" exitCode=0 Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.947094 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" event={"ID":"122f314d-cff5-4699-9d4c-c5221b9174ba","Type":"ContainerDied","Data":"92e3ba0094d9e19dc1893f58f0e9c8e81492bdb38226f6e7f3279650128d95ae"} Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.947158 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" event={"ID":"122f314d-cff5-4699-9d4c-c5221b9174ba","Type":"ContainerStarted","Data":"36889f829a2160d05f91b385108cdf9a5590e19fef1e04ca2f34aa4b2ad27816"} Mar 10 15:31:42 crc kubenswrapper[4743]: I0310 15:31:42.980807 4743 scope.go:117] "RemoveContainer" containerID="77c33115a71c71745d87fcad68674d5d2dfb17a36aae322113646b810ccf6e40" Mar 10 15:31:43 crc kubenswrapper[4743]: I0310 15:31:43.179834 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-4297n"] Mar 10 15:31:43 crc kubenswrapper[4743]: I0310 15:31:43.192853 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-4297n"] Mar 10 15:31:43 crc kubenswrapper[4743]: I0310 15:31:43.507351 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:43 crc kubenswrapper[4743]: I0310 15:31:43.507416 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:43 crc kubenswrapper[4743]: I0310 15:31:43.926632 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17052016-cb68-4e74-82e3-05531596e17e" path="/var/lib/kubelet/pods/17052016-cb68-4e74-82e3-05531596e17e/volumes" Mar 10 15:31:43 crc kubenswrapper[4743]: I0310 15:31:43.968170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" event={"ID":"122f314d-cff5-4699-9d4c-c5221b9174ba","Type":"ContainerStarted","Data":"b5269302f0505acf9ed042636e36b440a68bd0866edec15b324286801b8f2e97"} Mar 10 15:31:43 crc kubenswrapper[4743]: I0310 15:31:43.968394 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:44 crc kubenswrapper[4743]: I0310 15:31:43.999041 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" podStartSLOduration=2.999015184 podStartE2EDuration="2.999015184s" podCreationTimestamp="2026-03-10 15:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:31:43.990220163 +0000 UTC m=+1568.697034921" watchObservedRunningTime="2026-03-10 15:31:43.999015184 +0000 UTC m=+1568.705829942" Mar 10 15:31:44 crc kubenswrapper[4743]: I0310 15:31:44.566240 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7zf95" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" containerName="registry-server" probeResult="failure" output=< Mar 10 15:31:44 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 15:31:44 crc 
kubenswrapper[4743]: > Mar 10 15:31:51 crc kubenswrapper[4743]: I0310 15:31:51.992155 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b5f8b59f-qjn6h" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.094286 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-rmbjj"] Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.095045 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" podUID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" containerName="dnsmasq-dns" containerID="cri-o://ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c" gracePeriod=10 Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.636586 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.796795 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-swift-storage-0\") pod \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.796988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-nb\") pod \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.797023 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-openstack-edpm-ipam\") pod \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " 
Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.797054 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-config\") pod \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.797129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b4wb\" (UniqueName: \"kubernetes.io/projected/1d461cfd-9fd1-4674-97d2-65c42611c1d1-kube-api-access-4b4wb\") pod \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.797729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-svc\") pod \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.797768 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-sb\") pod \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\" (UID: \"1d461cfd-9fd1-4674-97d2-65c42611c1d1\") " Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.802547 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d461cfd-9fd1-4674-97d2-65c42611c1d1-kube-api-access-4b4wb" (OuterVolumeSpecName: "kube-api-access-4b4wb") pod "1d461cfd-9fd1-4674-97d2-65c42611c1d1" (UID: "1d461cfd-9fd1-4674-97d2-65c42611c1d1"). InnerVolumeSpecName "kube-api-access-4b4wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.848270 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1d461cfd-9fd1-4674-97d2-65c42611c1d1" (UID: "1d461cfd-9fd1-4674-97d2-65c42611c1d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.851375 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d461cfd-9fd1-4674-97d2-65c42611c1d1" (UID: "1d461cfd-9fd1-4674-97d2-65c42611c1d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.856663 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-config" (OuterVolumeSpecName: "config") pod "1d461cfd-9fd1-4674-97d2-65c42611c1d1" (UID: "1d461cfd-9fd1-4674-97d2-65c42611c1d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.862466 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "1d461cfd-9fd1-4674-97d2-65c42611c1d1" (UID: "1d461cfd-9fd1-4674-97d2-65c42611c1d1"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.863120 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1d461cfd-9fd1-4674-97d2-65c42611c1d1" (UID: "1d461cfd-9fd1-4674-97d2-65c42611c1d1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.864253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d461cfd-9fd1-4674-97d2-65c42611c1d1" (UID: "1d461cfd-9fd1-4674-97d2-65c42611c1d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.900399 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.900442 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.900484 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.900501 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b4wb\" (UniqueName: 
\"kubernetes.io/projected/1d461cfd-9fd1-4674-97d2-65c42611c1d1-kube-api-access-4b4wb\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.900514 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.900524 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:52 crc kubenswrapper[4743]: I0310 15:31:52.900559 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d461cfd-9fd1-4674-97d2-65c42611c1d1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.074478 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" containerID="ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c" exitCode=0 Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.074528 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.074540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" event={"ID":"1d461cfd-9fd1-4674-97d2-65c42611c1d1","Type":"ContainerDied","Data":"ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c"} Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.074575 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-rmbjj" event={"ID":"1d461cfd-9fd1-4674-97d2-65c42611c1d1","Type":"ContainerDied","Data":"a592dc35e263fa6beb325eff397df6a575c1ca19bd0f7cb81ac9d96694716714"} Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.074600 4743 scope.go:117] "RemoveContainer" containerID="ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.098524 4743 scope.go:117] "RemoveContainer" containerID="37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.114389 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-rmbjj"] Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.126075 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-rmbjj"] Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.136151 4743 scope.go:117] "RemoveContainer" containerID="ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c" Mar 10 15:31:53 crc kubenswrapper[4743]: E0310 15:31:53.136643 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c\": container with ID starting with ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c not found: ID does not exist" 
containerID="ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.136712 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c"} err="failed to get container status \"ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c\": rpc error: code = NotFound desc = could not find container \"ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c\": container with ID starting with ca70a7b95e6566b7b926858ef079f4388fda8431de23a9293575e0eef9ac489c not found: ID does not exist" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.136744 4743 scope.go:117] "RemoveContainer" containerID="37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a" Mar 10 15:31:53 crc kubenswrapper[4743]: E0310 15:31:53.137114 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a\": container with ID starting with 37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a not found: ID does not exist" containerID="37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.137423 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a"} err="failed to get container status \"37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a\": rpc error: code = NotFound desc = could not find container \"37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a\": container with ID starting with 37abe089d9aabd7cc24e0c7d2c8490f45ab69a19ea10cefa30e5fa035ddbc62a not found: ID does not exist" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.585165 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.649091 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.847922 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7zf95"] Mar 10 15:31:53 crc kubenswrapper[4743]: I0310 15:31:53.927642 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" path="/var/lib/kubelet/pods/1d461cfd-9fd1-4674-97d2-65c42611c1d1/volumes" Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.101465 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7zf95" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" containerName="registry-server" containerID="cri-o://abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5" gracePeriod=2 Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.628675 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.761723 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-catalog-content\") pod \"2d1204a0-99da-43e9-8060-00a1e906f044\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.761887 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-utilities\") pod \"2d1204a0-99da-43e9-8060-00a1e906f044\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.762047 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd7zh\" (UniqueName: \"kubernetes.io/projected/2d1204a0-99da-43e9-8060-00a1e906f044-kube-api-access-hd7zh\") pod \"2d1204a0-99da-43e9-8060-00a1e906f044\" (UID: \"2d1204a0-99da-43e9-8060-00a1e906f044\") " Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.762877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-utilities" (OuterVolumeSpecName: "utilities") pod "2d1204a0-99da-43e9-8060-00a1e906f044" (UID: "2d1204a0-99da-43e9-8060-00a1e906f044"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.776068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1204a0-99da-43e9-8060-00a1e906f044-kube-api-access-hd7zh" (OuterVolumeSpecName: "kube-api-access-hd7zh") pod "2d1204a0-99da-43e9-8060-00a1e906f044" (UID: "2d1204a0-99da-43e9-8060-00a1e906f044"). InnerVolumeSpecName "kube-api-access-hd7zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.864182 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd7zh\" (UniqueName: \"kubernetes.io/projected/2d1204a0-99da-43e9-8060-00a1e906f044-kube-api-access-hd7zh\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.864217 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.940665 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d1204a0-99da-43e9-8060-00a1e906f044" (UID: "2d1204a0-99da-43e9-8060-00a1e906f044"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:55 crc kubenswrapper[4743]: I0310 15:31:55.965880 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1204a0-99da-43e9-8060-00a1e906f044-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.114950 4743 generic.go:334] "Generic (PLEG): container finished" podID="2d1204a0-99da-43e9-8060-00a1e906f044" containerID="abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5" exitCode=0 Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.115206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zf95" event={"ID":"2d1204a0-99da-43e9-8060-00a1e906f044","Type":"ContainerDied","Data":"abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5"} Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.115240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-7zf95" event={"ID":"2d1204a0-99da-43e9-8060-00a1e906f044","Type":"ContainerDied","Data":"8308fa50313c034eeb5d12ad32a6afaa5a35d57d89124d960abee3dd665b0bb2"} Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.115247 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7zf95" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.115263 4743 scope.go:117] "RemoveContainer" containerID="abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.156011 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7zf95"] Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.161107 4743 scope.go:117] "RemoveContainer" containerID="ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.168234 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7zf95"] Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.208281 4743 scope.go:117] "RemoveContainer" containerID="fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.245335 4743 scope.go:117] "RemoveContainer" containerID="abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5" Mar 10 15:31:56 crc kubenswrapper[4743]: E0310 15:31:56.245930 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5\": container with ID starting with abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5 not found: ID does not exist" containerID="abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.245978 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5"} err="failed to get container status \"abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5\": rpc error: code = NotFound desc = could not find container \"abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5\": container with ID starting with abc65ba2899bcc3e4570c4211eb2a762063085444102c39c69ecdb8448709df5 not found: ID does not exist" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.246013 4743 scope.go:117] "RemoveContainer" containerID="ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766" Mar 10 15:31:56 crc kubenswrapper[4743]: E0310 15:31:56.246343 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766\": container with ID starting with ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766 not found: ID does not exist" containerID="ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.246387 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766"} err="failed to get container status \"ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766\": rpc error: code = NotFound desc = could not find container \"ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766\": container with ID starting with ff3c54fa77ab02626a0eeb62e76895b515ac0237632e47db7ecece7ede224766 not found: ID does not exist" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.246411 4743 scope.go:117] "RemoveContainer" containerID="fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51" Mar 10 15:31:56 crc kubenswrapper[4743]: E0310 
15:31:56.246757 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51\": container with ID starting with fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51 not found: ID does not exist" containerID="fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51" Mar 10 15:31:56 crc kubenswrapper[4743]: I0310 15:31:56.246798 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51"} err="failed to get container status \"fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51\": rpc error: code = NotFound desc = could not find container \"fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51\": container with ID starting with fb9c1e721daff54e0afe3f5f29a3114ca6277cc9705c5e293211be80f4a63f51 not found: ID does not exist" Mar 10 15:31:57 crc kubenswrapper[4743]: I0310 15:31:57.943426 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" path="/var/lib/kubelet/pods/2d1204a0-99da-43e9-8060-00a1e906f044/volumes" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.153211 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552612-lq6pb"] Mar 10 15:32:00 crc kubenswrapper[4743]: E0310 15:32:00.154273 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" containerName="extract-utilities" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154297 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" containerName="extract-utilities" Mar 10 15:32:00 crc kubenswrapper[4743]: E0310 15:32:00.154335 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" containerName="init" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154347 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" containerName="init" Mar 10 15:32:00 crc kubenswrapper[4743]: E0310 15:32:00.154370 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17052016-cb68-4e74-82e3-05531596e17e" containerName="init" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154381 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="17052016-cb68-4e74-82e3-05531596e17e" containerName="init" Mar 10 15:32:00 crc kubenswrapper[4743]: E0310 15:32:00.154398 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154408 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4743]: E0310 15:32:00.154440 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" containerName="dnsmasq-dns" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154449 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" containerName="dnsmasq-dns" Mar 10 15:32:00 crc kubenswrapper[4743]: E0310 15:32:00.154461 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17052016-cb68-4e74-82e3-05531596e17e" containerName="dnsmasq-dns" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154471 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="17052016-cb68-4e74-82e3-05531596e17e" containerName="dnsmasq-dns" Mar 10 15:32:00 crc kubenswrapper[4743]: E0310 15:32:00.154497 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" 
containerName="extract-content" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154507 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" containerName="extract-content" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154879 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d461cfd-9fd1-4674-97d2-65c42611c1d1" containerName="dnsmasq-dns" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154917 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1204a0-99da-43e9-8060-00a1e906f044" containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.154931 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="17052016-cb68-4e74-82e3-05531596e17e" containerName="dnsmasq-dns" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.159154 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552612-lq6pb" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.162048 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.162066 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.162307 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.164604 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552612-lq6pb"] Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.260174 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9pp4\" (UniqueName: 
\"kubernetes.io/projected/ed0eafa7-a122-440d-981f-f5b873fe6255-kube-api-access-l9pp4\") pod \"auto-csr-approver-29552612-lq6pb\" (UID: \"ed0eafa7-a122-440d-981f-f5b873fe6255\") " pod="openshift-infra/auto-csr-approver-29552612-lq6pb" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.363437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9pp4\" (UniqueName: \"kubernetes.io/projected/ed0eafa7-a122-440d-981f-f5b873fe6255-kube-api-access-l9pp4\") pod \"auto-csr-approver-29552612-lq6pb\" (UID: \"ed0eafa7-a122-440d-981f-f5b873fe6255\") " pod="openshift-infra/auto-csr-approver-29552612-lq6pb" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.391601 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9pp4\" (UniqueName: \"kubernetes.io/projected/ed0eafa7-a122-440d-981f-f5b873fe6255-kube-api-access-l9pp4\") pod \"auto-csr-approver-29552612-lq6pb\" (UID: \"ed0eafa7-a122-440d-981f-f5b873fe6255\") " pod="openshift-infra/auto-csr-approver-29552612-lq6pb" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.485595 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552612-lq6pb" Mar 10 15:32:00 crc kubenswrapper[4743]: I0310 15:32:00.972093 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552612-lq6pb"] Mar 10 15:32:01 crc kubenswrapper[4743]: I0310 15:32:01.172577 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552612-lq6pb" event={"ID":"ed0eafa7-a122-440d-981f-f5b873fe6255","Type":"ContainerStarted","Data":"3ace0a4ee15421e5bf8a3c7b28ef32473fc2fcfc60757d0d07b2c1043084f938"} Mar 10 15:32:03 crc kubenswrapper[4743]: I0310 15:32:03.193391 4743 generic.go:334] "Generic (PLEG): container finished" podID="ed0eafa7-a122-440d-981f-f5b873fe6255" containerID="71168cb29e252d52842812232b59f3ab91750eedb5242abe78a9583409ffdbb5" exitCode=0 Mar 10 15:32:03 crc kubenswrapper[4743]: I0310 15:32:03.193464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552612-lq6pb" event={"ID":"ed0eafa7-a122-440d-981f-f5b873fe6255","Type":"ContainerDied","Data":"71168cb29e252d52842812232b59f3ab91750eedb5242abe78a9583409ffdbb5"} Mar 10 15:32:04 crc kubenswrapper[4743]: I0310 15:32:04.713242 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552612-lq6pb" Mar 10 15:32:04 crc kubenswrapper[4743]: I0310 15:32:04.878612 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9pp4\" (UniqueName: \"kubernetes.io/projected/ed0eafa7-a122-440d-981f-f5b873fe6255-kube-api-access-l9pp4\") pod \"ed0eafa7-a122-440d-981f-f5b873fe6255\" (UID: \"ed0eafa7-a122-440d-981f-f5b873fe6255\") " Mar 10 15:32:04 crc kubenswrapper[4743]: I0310 15:32:04.887739 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0eafa7-a122-440d-981f-f5b873fe6255-kube-api-access-l9pp4" (OuterVolumeSpecName: "kube-api-access-l9pp4") pod "ed0eafa7-a122-440d-981f-f5b873fe6255" (UID: "ed0eafa7-a122-440d-981f-f5b873fe6255"). InnerVolumeSpecName "kube-api-access-l9pp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:32:04 crc kubenswrapper[4743]: I0310 15:32:04.981043 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9pp4\" (UniqueName: \"kubernetes.io/projected/ed0eafa7-a122-440d-981f-f5b873fe6255-kube-api-access-l9pp4\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.165313 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq"] Mar 10 15:32:05 crc kubenswrapper[4743]: E0310 15:32:05.166131 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0eafa7-a122-440d-981f-f5b873fe6255" containerName="oc" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.166237 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0eafa7-a122-440d-981f-f5b873fe6255" containerName="oc" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.166533 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0eafa7-a122-440d-981f-f5b873fe6255" containerName="oc" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 
15:32:05.167452 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.172947 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.173051 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.173891 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.174041 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.188080 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq"] Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.218685 4743 generic.go:334] "Generic (PLEG): container finished" podID="e44585ef-8ab2-45e9-a4f3-f333629f433a" containerID="632a6bcd5d794e702c3680c538d3d3b6f0b309ba2bbd73f7f956b4754a37f125" exitCode=0 Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.218753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44585ef-8ab2-45e9-a4f3-f333629f433a","Type":"ContainerDied","Data":"632a6bcd5d794e702c3680c538d3d3b6f0b309ba2bbd73f7f956b4754a37f125"} Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.222054 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552612-lq6pb" event={"ID":"ed0eafa7-a122-440d-981f-f5b873fe6255","Type":"ContainerDied","Data":"3ace0a4ee15421e5bf8a3c7b28ef32473fc2fcfc60757d0d07b2c1043084f938"} Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 
15:32:05.222180 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552612-lq6pb" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.222293 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ace0a4ee15421e5bf8a3c7b28ef32473fc2fcfc60757d0d07b2c1043084f938" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.286643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.286984 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spsv2\" (UniqueName: \"kubernetes.io/projected/dc9f4770-25d4-4119-b914-9cffc9049566-kube-api-access-spsv2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.287091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.287183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.389668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.389756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spsv2\" (UniqueName: \"kubernetes.io/projected/dc9f4770-25d4-4119-b914-9cffc9049566-kube-api-access-spsv2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.389832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.389862 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.395448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.396695 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.398994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.409873 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spsv2\" (UniqueName: \"kubernetes.io/projected/dc9f4770-25d4-4119-b914-9cffc9049566-kube-api-access-spsv2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.494580 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.796640 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552606-qc86p"] Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.808273 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552606-qc86p"] Mar 10 15:32:05 crc kubenswrapper[4743]: I0310 15:32:05.926409 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4537e57c-2a6d-44f9-9724-2ad061bed454" path="/var/lib/kubelet/pods/4537e57c-2a6d-44f9-9724-2ad061bed454/volumes" Mar 10 15:32:06 crc kubenswrapper[4743]: I0310 15:32:06.195895 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq"] Mar 10 15:32:06 crc kubenswrapper[4743]: W0310 15:32:06.198148 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc9f4770_25d4_4119_b914_9cffc9049566.slice/crio-82663562e28ce348c70c8d30e2f70b3ea36ebfb091f61374c46e05aa26fc7520 WatchSource:0}: Error finding container 82663562e28ce348c70c8d30e2f70b3ea36ebfb091f61374c46e05aa26fc7520: Status 404 returned error can't find the container with id 82663562e28ce348c70c8d30e2f70b3ea36ebfb091f61374c46e05aa26fc7520 Mar 10 15:32:06 crc kubenswrapper[4743]: I0310 15:32:06.234853 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" event={"ID":"dc9f4770-25d4-4119-b914-9cffc9049566","Type":"ContainerStarted","Data":"82663562e28ce348c70c8d30e2f70b3ea36ebfb091f61374c46e05aa26fc7520"} Mar 10 15:32:06 crc kubenswrapper[4743]: I0310 15:32:06.237344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"e44585ef-8ab2-45e9-a4f3-f333629f433a","Type":"ContainerStarted","Data":"bf755f8933415cb1b317fcfd84e31c7f59ca4c477ebbf17d810aa63419cbae30"} Mar 10 15:32:06 crc kubenswrapper[4743]: I0310 15:32:06.237570 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 15:32:06 crc kubenswrapper[4743]: I0310 15:32:06.258883 4743 generic.go:334] "Generic (PLEG): container finished" podID="b741008c-73ba-4516-bb63-05b066d7051b" containerID="6a1796b823d3d0a1c3653618ece41e830dddf20f307893cb2e75361e6bb719ed" exitCode=0 Mar 10 15:32:06 crc kubenswrapper[4743]: I0310 15:32:06.258941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b741008c-73ba-4516-bb63-05b066d7051b","Type":"ContainerDied","Data":"6a1796b823d3d0a1c3653618ece41e830dddf20f307893cb2e75361e6bb719ed"} Mar 10 15:32:06 crc kubenswrapper[4743]: I0310 15:32:06.295894 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.295868213 podStartE2EDuration="37.295868213s" podCreationTimestamp="2026-03-10 15:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:32:06.276480158 +0000 UTC m=+1590.983294916" watchObservedRunningTime="2026-03-10 15:32:06.295868213 +0000 UTC m=+1591.002682971" Mar 10 15:32:07 crc kubenswrapper[4743]: I0310 15:32:07.275147 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b741008c-73ba-4516-bb63-05b066d7051b","Type":"ContainerStarted","Data":"f2f6a53b429135eee24a4156193d3f9d68794757f13eb6ed5fe8c23e4db80b9d"} Mar 10 15:32:07 crc kubenswrapper[4743]: I0310 15:32:07.275674 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:32:07 crc kubenswrapper[4743]: I0310 15:32:07.306737 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.306711987 podStartE2EDuration="37.306711987s" podCreationTimestamp="2026-03-10 15:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:32:07.301649392 +0000 UTC m=+1592.008464130" watchObservedRunningTime="2026-03-10 15:32:07.306711987 +0000 UTC m=+1592.013526725" Mar 10 15:32:11 crc kubenswrapper[4743]: I0310 15:32:11.253735 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:32:11 crc kubenswrapper[4743]: I0310 15:32:11.256066 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:32:17 crc kubenswrapper[4743]: I0310 15:32:17.384930 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" event={"ID":"dc9f4770-25d4-4119-b914-9cffc9049566","Type":"ContainerStarted","Data":"7ba6ba856233deabf3c54cdb7efca8e5e5c951887499accc1ddd821b2afb6bf0"} Mar 10 15:32:17 crc kubenswrapper[4743]: I0310 15:32:17.405938 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" podStartSLOduration=1.7852471570000001 podStartE2EDuration="12.405906624s" podCreationTimestamp="2026-03-10 15:32:05 +0000 UTC" firstStartedPulling="2026-03-10 15:32:06.200501954 +0000 UTC m=+1590.907316702" 
lastFinishedPulling="2026-03-10 15:32:16.821161421 +0000 UTC m=+1601.527976169" observedRunningTime="2026-03-10 15:32:17.401095576 +0000 UTC m=+1602.107910364" watchObservedRunningTime="2026-03-10 15:32:17.405906624 +0000 UTC m=+1602.112721422" Mar 10 15:32:20 crc kubenswrapper[4743]: I0310 15:32:20.229089 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 15:32:21 crc kubenswrapper[4743]: I0310 15:32:21.301066 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:32:27 crc kubenswrapper[4743]: I0310 15:32:27.485511 4743 generic.go:334] "Generic (PLEG): container finished" podID="dc9f4770-25d4-4119-b914-9cffc9049566" containerID="7ba6ba856233deabf3c54cdb7efca8e5e5c951887499accc1ddd821b2afb6bf0" exitCode=0 Mar 10 15:32:27 crc kubenswrapper[4743]: I0310 15:32:27.485716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" event={"ID":"dc9f4770-25d4-4119-b914-9cffc9049566","Type":"ContainerDied","Data":"7ba6ba856233deabf3c54cdb7efca8e5e5c951887499accc1ddd821b2afb6bf0"} Mar 10 15:32:28 crc kubenswrapper[4743]: I0310 15:32:28.963752 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.107299 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-repo-setup-combined-ca-bundle\") pod \"dc9f4770-25d4-4119-b914-9cffc9049566\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.107353 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-ssh-key-openstack-edpm-ipam\") pod \"dc9f4770-25d4-4119-b914-9cffc9049566\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.107451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-inventory\") pod \"dc9f4770-25d4-4119-b914-9cffc9049566\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.107517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spsv2\" (UniqueName: \"kubernetes.io/projected/dc9f4770-25d4-4119-b914-9cffc9049566-kube-api-access-spsv2\") pod \"dc9f4770-25d4-4119-b914-9cffc9049566\" (UID: \"dc9f4770-25d4-4119-b914-9cffc9049566\") " Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.113701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9f4770-25d4-4119-b914-9cffc9049566-kube-api-access-spsv2" (OuterVolumeSpecName: "kube-api-access-spsv2") pod "dc9f4770-25d4-4119-b914-9cffc9049566" (UID: "dc9f4770-25d4-4119-b914-9cffc9049566"). InnerVolumeSpecName "kube-api-access-spsv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.114203 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "dc9f4770-25d4-4119-b914-9cffc9049566" (UID: "dc9f4770-25d4-4119-b914-9cffc9049566"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.140942 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dc9f4770-25d4-4119-b914-9cffc9049566" (UID: "dc9f4770-25d4-4119-b914-9cffc9049566"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.148738 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-inventory" (OuterVolumeSpecName: "inventory") pod "dc9f4770-25d4-4119-b914-9cffc9049566" (UID: "dc9f4770-25d4-4119-b914-9cffc9049566"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.210094 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.210281 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.210356 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc9f4770-25d4-4119-b914-9cffc9049566-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.210428 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spsv2\" (UniqueName: \"kubernetes.io/projected/dc9f4770-25d4-4119-b914-9cffc9049566-kube-api-access-spsv2\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.505253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" event={"ID":"dc9f4770-25d4-4119-b914-9cffc9049566","Type":"ContainerDied","Data":"82663562e28ce348c70c8d30e2f70b3ea36ebfb091f61374c46e05aa26fc7520"} Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.505502 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82663562e28ce348c70c8d30e2f70b3ea36ebfb091f61374c46e05aa26fc7520" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.505508 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.599518 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h"] Mar 10 15:32:29 crc kubenswrapper[4743]: E0310 15:32:29.600108 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9f4770-25d4-4119-b914-9cffc9049566" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.600134 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9f4770-25d4-4119-b914-9cffc9049566" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.600386 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9f4770-25d4-4119-b914-9cffc9049566" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.601395 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.603687 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.604045 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.604381 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.606549 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.672994 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h"] Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.769276 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.769354 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.769383 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hm4l\" (UniqueName: \"kubernetes.io/projected/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-kube-api-access-2hm4l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.872053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.872361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hm4l\" (UniqueName: \"kubernetes.io/projected/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-kube-api-access-2hm4l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.873185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.876957 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: 
\"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.877552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.901230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hm4l\" (UniqueName: \"kubernetes.io/projected/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-kube-api-access-2hm4l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7kr4h\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:29 crc kubenswrapper[4743]: I0310 15:32:29.999638 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:30 crc kubenswrapper[4743]: I0310 15:32:30.573372 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h"] Mar 10 15:32:30 crc kubenswrapper[4743]: W0310 15:32:30.576649 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c7d6ed2_094e_4726_b1b3_6238cb505f5e.slice/crio-bc548fa2c0f3e075e57540512563ead7bdd1dcb682d2fbf4ee33bb8eece2e978 WatchSource:0}: Error finding container bc548fa2c0f3e075e57540512563ead7bdd1dcb682d2fbf4ee33bb8eece2e978: Status 404 returned error can't find the container with id bc548fa2c0f3e075e57540512563ead7bdd1dcb682d2fbf4ee33bb8eece2e978 Mar 10 15:32:31 crc kubenswrapper[4743]: I0310 15:32:31.526608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" event={"ID":"1c7d6ed2-094e-4726-b1b3-6238cb505f5e","Type":"ContainerStarted","Data":"1b341440823c54c8d712ffbaca01c9a9a8738c641854790d7506177bb4116507"} Mar 10 15:32:31 crc kubenswrapper[4743]: I0310 15:32:31.527008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" event={"ID":"1c7d6ed2-094e-4726-b1b3-6238cb505f5e","Type":"ContainerStarted","Data":"bc548fa2c0f3e075e57540512563ead7bdd1dcb682d2fbf4ee33bb8eece2e978"} Mar 10 15:32:31 crc kubenswrapper[4743]: I0310 15:32:31.551883 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" podStartSLOduration=2.043986853 podStartE2EDuration="2.551861156s" podCreationTimestamp="2026-03-10 15:32:29 +0000 UTC" firstStartedPulling="2026-03-10 15:32:30.578769401 +0000 UTC m=+1615.285584149" lastFinishedPulling="2026-03-10 15:32:31.086643704 +0000 UTC m=+1615.793458452" observedRunningTime="2026-03-10 
15:32:31.546143203 +0000 UTC m=+1616.252957961" watchObservedRunningTime="2026-03-10 15:32:31.551861156 +0000 UTC m=+1616.258675904" Mar 10 15:32:34 crc kubenswrapper[4743]: I0310 15:32:34.565120 4743 generic.go:334] "Generic (PLEG): container finished" podID="1c7d6ed2-094e-4726-b1b3-6238cb505f5e" containerID="1b341440823c54c8d712ffbaca01c9a9a8738c641854790d7506177bb4116507" exitCode=0 Mar 10 15:32:34 crc kubenswrapper[4743]: I0310 15:32:34.565202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" event={"ID":"1c7d6ed2-094e-4726-b1b3-6238cb505f5e","Type":"ContainerDied","Data":"1b341440823c54c8d712ffbaca01c9a9a8738c641854790d7506177bb4116507"} Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.141550 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.263158 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-ssh-key-openstack-edpm-ipam\") pod \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.263393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hm4l\" (UniqueName: \"kubernetes.io/projected/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-kube-api-access-2hm4l\") pod \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\" (UID: \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.263538 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-inventory\") pod \"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\" (UID: 
\"1c7d6ed2-094e-4726-b1b3-6238cb505f5e\") " Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.270856 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-kube-api-access-2hm4l" (OuterVolumeSpecName: "kube-api-access-2hm4l") pod "1c7d6ed2-094e-4726-b1b3-6238cb505f5e" (UID: "1c7d6ed2-094e-4726-b1b3-6238cb505f5e"). InnerVolumeSpecName "kube-api-access-2hm4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.332973 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1c7d6ed2-094e-4726-b1b3-6238cb505f5e" (UID: "1c7d6ed2-094e-4726-b1b3-6238cb505f5e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.365687 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.365714 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hm4l\" (UniqueName: \"kubernetes.io/projected/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-kube-api-access-2hm4l\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.386983 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-inventory" (OuterVolumeSpecName: "inventory") pod "1c7d6ed2-094e-4726-b1b3-6238cb505f5e" (UID: "1c7d6ed2-094e-4726-b1b3-6238cb505f5e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.469383 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d6ed2-094e-4726-b1b3-6238cb505f5e-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.594112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" event={"ID":"1c7d6ed2-094e-4726-b1b3-6238cb505f5e","Type":"ContainerDied","Data":"bc548fa2c0f3e075e57540512563ead7bdd1dcb682d2fbf4ee33bb8eece2e978"} Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.594428 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc548fa2c0f3e075e57540512563ead7bdd1dcb682d2fbf4ee33bb8eece2e978" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.594378 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7kr4h" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.677071 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4"] Mar 10 15:32:36 crc kubenswrapper[4743]: E0310 15:32:36.677508 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7d6ed2-094e-4726-b1b3-6238cb505f5e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.677530 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7d6ed2-094e-4726-b1b3-6238cb505f5e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.677707 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7d6ed2-094e-4726-b1b3-6238cb505f5e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.686976 
4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4"] Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.687181 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.689106 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.689174 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.689993 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.690152 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.774330 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.774457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b47xx\" (UniqueName: \"kubernetes.io/projected/5806fcf8-1c71-408a-b87a-c4574daf14b6-kube-api-access-b47xx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 
10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.774530 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.774763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.876415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b47xx\" (UniqueName: \"kubernetes.io/projected/5806fcf8-1c71-408a-b87a-c4574daf14b6-kube-api-access-b47xx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.876499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.876567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.876854 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.882205 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.882232 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.882473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:36 crc kubenswrapper[4743]: I0310 15:32:36.898333 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b47xx\" (UniqueName: \"kubernetes.io/projected/5806fcf8-1c71-408a-b87a-c4574daf14b6-kube-api-access-b47xx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:37 crc kubenswrapper[4743]: I0310 15:32:37.042392 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:32:37 crc kubenswrapper[4743]: I0310 15:32:37.583439 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4"] Mar 10 15:32:37 crc kubenswrapper[4743]: I0310 15:32:37.616227 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" event={"ID":"5806fcf8-1c71-408a-b87a-c4574daf14b6","Type":"ContainerStarted","Data":"028ae68232e7340477b67dc7ab880d9cded6ca907828112f83b1976d232c5d4e"} Mar 10 15:32:38 crc kubenswrapper[4743]: I0310 15:32:38.629147 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" event={"ID":"5806fcf8-1c71-408a-b87a-c4574daf14b6","Type":"ContainerStarted","Data":"1fa2f4a4b67431d08341359537af951cb1002bccca3e8f572904f27a1cd2ab5f"} Mar 10 15:32:38 crc kubenswrapper[4743]: I0310 15:32:38.648558 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" podStartSLOduration=2.185174558 podStartE2EDuration="2.648534796s" podCreationTimestamp="2026-03-10 15:32:36 +0000 UTC" firstStartedPulling="2026-03-10 15:32:37.602981133 +0000 UTC m=+1622.309795881" 
lastFinishedPulling="2026-03-10 15:32:38.066341361 +0000 UTC m=+1622.773156119" observedRunningTime="2026-03-10 15:32:38.647495127 +0000 UTC m=+1623.354309875" watchObservedRunningTime="2026-03-10 15:32:38.648534796 +0000 UTC m=+1623.355349544" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.136934 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-96xd7"] Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.139773 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.153579 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-96xd7"] Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.252306 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.252375 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.252434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.253331 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.253398 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" gracePeriod=600 Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.294110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-utilities\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.294444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-catalog-content\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.294499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qb5r\" (UniqueName: \"kubernetes.io/projected/0b2eaabe-9056-4bb1-a91c-2c774893fad2-kube-api-access-4qb5r\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: E0310 15:32:41.378429 4743 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.396465 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-utilities\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.396523 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-catalog-content\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.396566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qb5r\" (UniqueName: \"kubernetes.io/projected/0b2eaabe-9056-4bb1-a91c-2c774893fad2-kube-api-access-4qb5r\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.397130 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-utilities\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc 
kubenswrapper[4743]: I0310 15:32:41.397303 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-catalog-content\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.419969 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qb5r\" (UniqueName: \"kubernetes.io/projected/0b2eaabe-9056-4bb1-a91c-2c774893fad2-kube-api-access-4qb5r\") pod \"community-operators-96xd7\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.505931 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.723006 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" exitCode=0 Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.723320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d"} Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.723374 4743 scope.go:117] "RemoveContainer" containerID="fd61460384d6cf2cf3d97e1581a2275d487c44dd87d687ca072c07eb9d139f79" Mar 10 15:32:41 crc kubenswrapper[4743]: I0310 15:32:41.724088 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:32:41 crc kubenswrapper[4743]: E0310 15:32:41.724416 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:32:42 crc kubenswrapper[4743]: I0310 15:32:42.078489 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-96xd7"] Mar 10 15:32:42 crc kubenswrapper[4743]: E0310 15:32:42.387151 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b2eaabe_9056_4bb1_a91c_2c774893fad2.slice/crio-76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b2eaabe_9056_4bb1_a91c_2c774893fad2.slice/crio-conmon-76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:32:42 crc kubenswrapper[4743]: I0310 15:32:42.735466 4743 generic.go:334] "Generic (PLEG): container finished" podID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerID="76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35" exitCode=0 Mar 10 15:32:42 crc kubenswrapper[4743]: I0310 15:32:42.735567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96xd7" event={"ID":"0b2eaabe-9056-4bb1-a91c-2c774893fad2","Type":"ContainerDied","Data":"76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35"} Mar 10 15:32:42 crc kubenswrapper[4743]: I0310 15:32:42.736446 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-96xd7" event={"ID":"0b2eaabe-9056-4bb1-a91c-2c774893fad2","Type":"ContainerStarted","Data":"316cc7e1e66fe057166154725850a305f63b5fbe75e0e38920ae53c5b8c514d8"} Mar 10 15:32:44 crc kubenswrapper[4743]: I0310 15:32:44.764465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96xd7" event={"ID":"0b2eaabe-9056-4bb1-a91c-2c774893fad2","Type":"ContainerStarted","Data":"c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a"} Mar 10 15:32:46 crc kubenswrapper[4743]: I0310 15:32:46.786699 4743 generic.go:334] "Generic (PLEG): container finished" podID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerID="c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a" exitCode=0 Mar 10 15:32:46 crc kubenswrapper[4743]: I0310 15:32:46.786783 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96xd7" event={"ID":"0b2eaabe-9056-4bb1-a91c-2c774893fad2","Type":"ContainerDied","Data":"c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a"} Mar 10 15:32:47 crc kubenswrapper[4743]: I0310 15:32:47.799593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96xd7" event={"ID":"0b2eaabe-9056-4bb1-a91c-2c774893fad2","Type":"ContainerStarted","Data":"5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba"} Mar 10 15:32:47 crc kubenswrapper[4743]: I0310 15:32:47.822632 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-96xd7" podStartSLOduration=2.322088867 podStartE2EDuration="6.822610585s" podCreationTimestamp="2026-03-10 15:32:41 +0000 UTC" firstStartedPulling="2026-03-10 15:32:42.73842396 +0000 UTC m=+1627.445238738" lastFinishedPulling="2026-03-10 15:32:47.238945668 +0000 UTC m=+1631.945760456" observedRunningTime="2026-03-10 15:32:47.816778659 +0000 UTC m=+1632.523593417" 
watchObservedRunningTime="2026-03-10 15:32:47.822610585 +0000 UTC m=+1632.529425333" Mar 10 15:32:51 crc kubenswrapper[4743]: I0310 15:32:51.506869 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:51 crc kubenswrapper[4743]: I0310 15:32:51.507423 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:51 crc kubenswrapper[4743]: I0310 15:32:51.592680 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:32:52 crc kubenswrapper[4743]: I0310 15:32:52.353696 4743 scope.go:117] "RemoveContainer" containerID="f42da41bad26b0eb018c23cb059e91151a8898f2abaa19696fef521470bd0419" Mar 10 15:32:52 crc kubenswrapper[4743]: I0310 15:32:52.385962 4743 scope.go:117] "RemoveContainer" containerID="c3987d1d2cee35f5b6f2659fd2201aebc1fa70e42902b873ee1ac8fbf669cc47" Mar 10 15:32:52 crc kubenswrapper[4743]: I0310 15:32:52.497609 4743 scope.go:117] "RemoveContainer" containerID="1d8697df409a13c9ff4e89cae95eb036e351e739db865e100680a3a6b05918d7" Mar 10 15:32:52 crc kubenswrapper[4743]: I0310 15:32:52.551119 4743 scope.go:117] "RemoveContainer" containerID="85961ca2d9d955f64b34fa59578fe2c676024233d93b9fc00272972bd27d3b40" Mar 10 15:32:53 crc kubenswrapper[4743]: I0310 15:32:53.916170 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:32:53 crc kubenswrapper[4743]: E0310 15:32:53.916832 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:33:01 crc kubenswrapper[4743]: I0310 15:33:01.553210 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:33:01 crc kubenswrapper[4743]: I0310 15:33:01.612557 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-96xd7"] Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.033390 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-96xd7" podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerName="registry-server" containerID="cri-o://5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba" gracePeriod=2 Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.591218 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.784691 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-catalog-content\") pod \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.785044 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qb5r\" (UniqueName: \"kubernetes.io/projected/0b2eaabe-9056-4bb1-a91c-2c774893fad2-kube-api-access-4qb5r\") pod \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\" (UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.785152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-utilities\") pod \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\" 
(UID: \"0b2eaabe-9056-4bb1-a91c-2c774893fad2\") " Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.786173 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-utilities" (OuterVolumeSpecName: "utilities") pod "0b2eaabe-9056-4bb1-a91c-2c774893fad2" (UID: "0b2eaabe-9056-4bb1-a91c-2c774893fad2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.791807 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2eaabe-9056-4bb1-a91c-2c774893fad2-kube-api-access-4qb5r" (OuterVolumeSpecName: "kube-api-access-4qb5r") pod "0b2eaabe-9056-4bb1-a91c-2c774893fad2" (UID: "0b2eaabe-9056-4bb1-a91c-2c774893fad2"). InnerVolumeSpecName "kube-api-access-4qb5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.845161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b2eaabe-9056-4bb1-a91c-2c774893fad2" (UID: "0b2eaabe-9056-4bb1-a91c-2c774893fad2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.887242 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.887297 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qb5r\" (UniqueName: \"kubernetes.io/projected/0b2eaabe-9056-4bb1-a91c-2c774893fad2-kube-api-access-4qb5r\") on node \"crc\" DevicePath \"\"" Mar 10 15:33:02 crc kubenswrapper[4743]: I0310 15:33:02.887311 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2eaabe-9056-4bb1-a91c-2c774893fad2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.046539 4743 generic.go:334] "Generic (PLEG): container finished" podID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerID="5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba" exitCode=0 Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.046618 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-96xd7" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.046600 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96xd7" event={"ID":"0b2eaabe-9056-4bb1-a91c-2c774893fad2","Type":"ContainerDied","Data":"5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba"} Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.046770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96xd7" event={"ID":"0b2eaabe-9056-4bb1-a91c-2c774893fad2","Type":"ContainerDied","Data":"316cc7e1e66fe057166154725850a305f63b5fbe75e0e38920ae53c5b8c514d8"} Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.046802 4743 scope.go:117] "RemoveContainer" containerID="5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.088018 4743 scope.go:117] "RemoveContainer" containerID="c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.096979 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-96xd7"] Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.109046 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-96xd7"] Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.128187 4743 scope.go:117] "RemoveContainer" containerID="76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.179143 4743 scope.go:117] "RemoveContainer" containerID="5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba" Mar 10 15:33:03 crc kubenswrapper[4743]: E0310 15:33:03.180004 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba\": container with ID starting with 5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba not found: ID does not exist" containerID="5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.180059 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba"} err="failed to get container status \"5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba\": rpc error: code = NotFound desc = could not find container \"5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba\": container with ID starting with 5c6daddb622b77b1ae98872d3df1b006c6d01a09772e20230ea0c476c0349bba not found: ID does not exist" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.180096 4743 scope.go:117] "RemoveContainer" containerID="c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a" Mar 10 15:33:03 crc kubenswrapper[4743]: E0310 15:33:03.180543 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a\": container with ID starting with c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a not found: ID does not exist" containerID="c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.180602 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a"} err="failed to get container status \"c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a\": rpc error: code = NotFound desc = could not find container \"c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a\": container with ID 
starting with c983cd723eca80910aa85bca8c982c168e92ada30dde4e2f0d350aecfbf21a2a not found: ID does not exist" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.180631 4743 scope.go:117] "RemoveContainer" containerID="76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35" Mar 10 15:33:03 crc kubenswrapper[4743]: E0310 15:33:03.181199 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35\": container with ID starting with 76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35 not found: ID does not exist" containerID="76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.181242 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35"} err="failed to get container status \"76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35\": rpc error: code = NotFound desc = could not find container \"76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35\": container with ID starting with 76b9d0ea8d8d0a660daa515c06d18c8793dd6ef108fe3a9b69b9fd746b319b35 not found: ID does not exist" Mar 10 15:33:03 crc kubenswrapper[4743]: I0310 15:33:03.930974 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" path="/var/lib/kubelet/pods/0b2eaabe-9056-4bb1-a91c-2c774893fad2/volumes" Mar 10 15:33:05 crc kubenswrapper[4743]: I0310 15:33:05.925947 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:33:05 crc kubenswrapper[4743]: E0310 15:33:05.926456 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:33:20 crc kubenswrapper[4743]: I0310 15:33:20.915157 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:33:20 crc kubenswrapper[4743]: E0310 15:33:20.915937 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:33:32 crc kubenswrapper[4743]: I0310 15:33:32.915589 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:33:32 crc kubenswrapper[4743]: E0310 15:33:32.917321 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:33:47 crc kubenswrapper[4743]: I0310 15:33:47.915547 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:33:47 crc kubenswrapper[4743]: E0310 15:33:47.916559 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:33:52 crc kubenswrapper[4743]: I0310 15:33:52.746658 4743 scope.go:117] "RemoveContainer" containerID="5bb36f4a4448b1c9b4f7e960684bb9b8fc1f491337262ba5647a451cad09750a" Mar 10 15:33:52 crc kubenswrapper[4743]: I0310 15:33:52.787732 4743 scope.go:117] "RemoveContainer" containerID="026b45c8e11d839a6f4981e884df8bf077362712339e5f1f160cbd9c38fba3e2" Mar 10 15:33:52 crc kubenswrapper[4743]: I0310 15:33:52.842554 4743 scope.go:117] "RemoveContainer" containerID="e0d52e7ff40eba6b44a945eb6d4ac6091b5525650d674c73187ec2dab5c70c36" Mar 10 15:33:59 crc kubenswrapper[4743]: I0310 15:33:59.915531 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:33:59 crc kubenswrapper[4743]: E0310 15:33:59.916429 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.155712 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552614-q49mz"] Mar 10 15:34:00 crc kubenswrapper[4743]: E0310 15:34:00.156259 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerName="registry-server" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.156282 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerName="registry-server" Mar 10 15:34:00 crc kubenswrapper[4743]: E0310 15:34:00.156310 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerName="extract-utilities" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.156318 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerName="extract-utilities" Mar 10 15:34:00 crc kubenswrapper[4743]: E0310 15:34:00.156335 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerName="extract-content" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.156341 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerName="extract-content" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.156534 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2eaabe-9056-4bb1-a91c-2c774893fad2" containerName="registry-server" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.157181 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552614-q49mz" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.159734 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.159849 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.161065 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.177445 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552614-q49mz"] Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.273139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8flg2\" (UniqueName: \"kubernetes.io/projected/519df03f-1942-4901-b698-7f2d2703704b-kube-api-access-8flg2\") pod \"auto-csr-approver-29552614-q49mz\" (UID: \"519df03f-1942-4901-b698-7f2d2703704b\") " pod="openshift-infra/auto-csr-approver-29552614-q49mz" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.375646 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8flg2\" (UniqueName: \"kubernetes.io/projected/519df03f-1942-4901-b698-7f2d2703704b-kube-api-access-8flg2\") pod \"auto-csr-approver-29552614-q49mz\" (UID: \"519df03f-1942-4901-b698-7f2d2703704b\") " pod="openshift-infra/auto-csr-approver-29552614-q49mz" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.398453 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8flg2\" (UniqueName: \"kubernetes.io/projected/519df03f-1942-4901-b698-7f2d2703704b-kube-api-access-8flg2\") pod \"auto-csr-approver-29552614-q49mz\" (UID: \"519df03f-1942-4901-b698-7f2d2703704b\") " 
pod="openshift-infra/auto-csr-approver-29552614-q49mz" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.479539 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552614-q49mz" Mar 10 15:34:00 crc kubenswrapper[4743]: I0310 15:34:00.954124 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552614-q49mz"] Mar 10 15:34:00 crc kubenswrapper[4743]: W0310 15:34:00.960003 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod519df03f_1942_4901_b698_7f2d2703704b.slice/crio-f2ab2f2eab6c6b82d6c18828fa6e218ecb52e3a6357568c83d2f2f2da353493c WatchSource:0}: Error finding container f2ab2f2eab6c6b82d6c18828fa6e218ecb52e3a6357568c83d2f2f2da353493c: Status 404 returned error can't find the container with id f2ab2f2eab6c6b82d6c18828fa6e218ecb52e3a6357568c83d2f2f2da353493c Mar 10 15:34:01 crc kubenswrapper[4743]: I0310 15:34:01.777681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552614-q49mz" event={"ID":"519df03f-1942-4901-b698-7f2d2703704b","Type":"ContainerStarted","Data":"f2ab2f2eab6c6b82d6c18828fa6e218ecb52e3a6357568c83d2f2f2da353493c"} Mar 10 15:34:02 crc kubenswrapper[4743]: I0310 15:34:02.787847 4743 generic.go:334] "Generic (PLEG): container finished" podID="519df03f-1942-4901-b698-7f2d2703704b" containerID="7f23faabcfffc57c70fc703b1ec09963ac7b1d751a3a2e37f2f723d3c983a24e" exitCode=0 Mar 10 15:34:02 crc kubenswrapper[4743]: I0310 15:34:02.787943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552614-q49mz" event={"ID":"519df03f-1942-4901-b698-7f2d2703704b","Type":"ContainerDied","Data":"7f23faabcfffc57c70fc703b1ec09963ac7b1d751a3a2e37f2f723d3c983a24e"} Mar 10 15:34:04 crc kubenswrapper[4743]: I0310 15:34:04.154098 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552614-q49mz" Mar 10 15:34:04 crc kubenswrapper[4743]: I0310 15:34:04.266897 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8flg2\" (UniqueName: \"kubernetes.io/projected/519df03f-1942-4901-b698-7f2d2703704b-kube-api-access-8flg2\") pod \"519df03f-1942-4901-b698-7f2d2703704b\" (UID: \"519df03f-1942-4901-b698-7f2d2703704b\") " Mar 10 15:34:04 crc kubenswrapper[4743]: I0310 15:34:04.275434 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519df03f-1942-4901-b698-7f2d2703704b-kube-api-access-8flg2" (OuterVolumeSpecName: "kube-api-access-8flg2") pod "519df03f-1942-4901-b698-7f2d2703704b" (UID: "519df03f-1942-4901-b698-7f2d2703704b"). InnerVolumeSpecName "kube-api-access-8flg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:34:04 crc kubenswrapper[4743]: I0310 15:34:04.370539 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8flg2\" (UniqueName: \"kubernetes.io/projected/519df03f-1942-4901-b698-7f2d2703704b-kube-api-access-8flg2\") on node \"crc\" DevicePath \"\"" Mar 10 15:34:04 crc kubenswrapper[4743]: I0310 15:34:04.810793 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552614-q49mz" event={"ID":"519df03f-1942-4901-b698-7f2d2703704b","Type":"ContainerDied","Data":"f2ab2f2eab6c6b82d6c18828fa6e218ecb52e3a6357568c83d2f2f2da353493c"} Mar 10 15:34:04 crc kubenswrapper[4743]: I0310 15:34:04.810857 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2ab2f2eab6c6b82d6c18828fa6e218ecb52e3a6357568c83d2f2f2da353493c" Mar 10 15:34:04 crc kubenswrapper[4743]: I0310 15:34:04.811111 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552614-q49mz" Mar 10 15:34:05 crc kubenswrapper[4743]: I0310 15:34:05.236909 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552608-5gj6v"] Mar 10 15:34:05 crc kubenswrapper[4743]: I0310 15:34:05.244558 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552608-5gj6v"] Mar 10 15:34:05 crc kubenswrapper[4743]: I0310 15:34:05.932761 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c5df5c-43af-4b40-8a2d-1db9b79a699b" path="/var/lib/kubelet/pods/e1c5df5c-43af-4b40-8a2d-1db9b79a699b/volumes" Mar 10 15:34:10 crc kubenswrapper[4743]: I0310 15:34:10.916000 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:34:10 crc kubenswrapper[4743]: E0310 15:34:10.917069 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:34:24 crc kubenswrapper[4743]: I0310 15:34:24.916216 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:34:24 crc kubenswrapper[4743]: E0310 15:34:24.917083 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:34:35 crc kubenswrapper[4743]: I0310 15:34:35.940797 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:34:35 crc kubenswrapper[4743]: E0310 15:34:35.942226 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:34:48 crc kubenswrapper[4743]: I0310 15:34:48.915700 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:34:48 crc kubenswrapper[4743]: E0310 15:34:48.916450 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:34:52 crc kubenswrapper[4743]: I0310 15:34:52.941794 4743 scope.go:117] "RemoveContainer" containerID="a5346a8a65619168cc227a55e1d4bfdb1f32ab8057f446183674c19eabe53427" Mar 10 15:34:52 crc kubenswrapper[4743]: I0310 15:34:52.966783 4743 scope.go:117] "RemoveContainer" containerID="57d71a9b49c92580bb1171b0636f48c22d22afa7827c7b2cd4c6a9807980e759" Mar 10 15:34:52 crc kubenswrapper[4743]: I0310 15:34:52.995879 4743 scope.go:117] "RemoveContainer" containerID="7855c5beceb4ee5a4847552ceae24450b14a4577211aa818bd718a1b1bc4f6a1" Mar 10 15:34:53 crc kubenswrapper[4743]: I0310 15:34:53.025258 4743 
scope.go:117] "RemoveContainer" containerID="878a24c679391f55b2dd107b7b07c7f9c11c9a18caef59e9ad5d91074f785300" Mar 10 15:34:53 crc kubenswrapper[4743]: I0310 15:34:53.073736 4743 scope.go:117] "RemoveContainer" containerID="6010b19f4be05d160cf049a7d05bf3cbadafd0e2a58035676a5bee21950c3aec" Mar 10 15:34:53 crc kubenswrapper[4743]: I0310 15:34:53.102942 4743 scope.go:117] "RemoveContainer" containerID="3037732e4bc0559e7b6cde4cfd03ea550f68128efb421bf7c4edb48f3c0b9e62" Mar 10 15:34:59 crc kubenswrapper[4743]: I0310 15:34:59.917985 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:34:59 crc kubenswrapper[4743]: E0310 15:34:59.918889 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:35:12 crc kubenswrapper[4743]: I0310 15:35:12.915994 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:35:12 crc kubenswrapper[4743]: E0310 15:35:12.917013 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:35:25 crc kubenswrapper[4743]: I0310 15:35:25.925879 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 
15:35:25 crc kubenswrapper[4743]: E0310 15:35:25.926763 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:35:30 crc kubenswrapper[4743]: I0310 15:35:30.897849 4743 generic.go:334] "Generic (PLEG): container finished" podID="5806fcf8-1c71-408a-b87a-c4574daf14b6" containerID="1fa2f4a4b67431d08341359537af951cb1002bccca3e8f572904f27a1cd2ab5f" exitCode=0 Mar 10 15:35:30 crc kubenswrapper[4743]: I0310 15:35:30.897975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" event={"ID":"5806fcf8-1c71-408a-b87a-c4574daf14b6","Type":"ContainerDied","Data":"1fa2f4a4b67431d08341359537af951cb1002bccca3e8f572904f27a1cd2ab5f"} Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.412005 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.509135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-ssh-key-openstack-edpm-ipam\") pod \"5806fcf8-1c71-408a-b87a-c4574daf14b6\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.509304 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-bootstrap-combined-ca-bundle\") pod \"5806fcf8-1c71-408a-b87a-c4574daf14b6\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.509508 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b47xx\" (UniqueName: \"kubernetes.io/projected/5806fcf8-1c71-408a-b87a-c4574daf14b6-kube-api-access-b47xx\") pod \"5806fcf8-1c71-408a-b87a-c4574daf14b6\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.509542 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-inventory\") pod \"5806fcf8-1c71-408a-b87a-c4574daf14b6\" (UID: \"5806fcf8-1c71-408a-b87a-c4574daf14b6\") " Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.522197 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5806fcf8-1c71-408a-b87a-c4574daf14b6-kube-api-access-b47xx" (OuterVolumeSpecName: "kube-api-access-b47xx") pod "5806fcf8-1c71-408a-b87a-c4574daf14b6" (UID: "5806fcf8-1c71-408a-b87a-c4574daf14b6"). InnerVolumeSpecName "kube-api-access-b47xx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.522197 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5806fcf8-1c71-408a-b87a-c4574daf14b6" (UID: "5806fcf8-1c71-408a-b87a-c4574daf14b6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.541626 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-inventory" (OuterVolumeSpecName: "inventory") pod "5806fcf8-1c71-408a-b87a-c4574daf14b6" (UID: "5806fcf8-1c71-408a-b87a-c4574daf14b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.551989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5806fcf8-1c71-408a-b87a-c4574daf14b6" (UID: "5806fcf8-1c71-408a-b87a-c4574daf14b6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.612594 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.612691 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.612711 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b47xx\" (UniqueName: \"kubernetes.io/projected/5806fcf8-1c71-408a-b87a-c4574daf14b6-kube-api-access-b47xx\") on node \"crc\" DevicePath \"\"" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.612730 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5806fcf8-1c71-408a-b87a-c4574daf14b6-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.927113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" event={"ID":"5806fcf8-1c71-408a-b87a-c4574daf14b6","Type":"ContainerDied","Data":"028ae68232e7340477b67dc7ab880d9cded6ca907828112f83b1976d232c5d4e"} Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.927195 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="028ae68232e7340477b67dc7ab880d9cded6ca907828112f83b1976d232c5d4e" Mar 10 15:35:32 crc kubenswrapper[4743]: I0310 15:35:32.928028 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.090950 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c"] Mar 10 15:35:33 crc kubenswrapper[4743]: E0310 15:35:33.091979 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519df03f-1942-4901-b698-7f2d2703704b" containerName="oc" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.092186 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="519df03f-1942-4901-b698-7f2d2703704b" containerName="oc" Mar 10 15:35:33 crc kubenswrapper[4743]: E0310 15:35:33.103953 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5806fcf8-1c71-408a-b87a-c4574daf14b6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.103977 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5806fcf8-1c71-408a-b87a-c4574daf14b6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.104385 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="519df03f-1942-4901-b698-7f2d2703704b" containerName="oc" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.104428 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5806fcf8-1c71-408a-b87a-c4574daf14b6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.105159 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c"] Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.105262 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.107807 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.108028 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.110538 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.114754 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.229015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.230025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlthq\" (UniqueName: \"kubernetes.io/projected/a14f160d-bf98-45da-a719-d8937e9281b0-kube-api-access-qlthq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.230137 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.331956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlthq\" (UniqueName: \"kubernetes.io/projected/a14f160d-bf98-45da-a719-d8937e9281b0-kube-api-access-qlthq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.332341 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.332442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.341597 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: 
\"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.341956 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.350525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlthq\" (UniqueName: \"kubernetes.io/projected/a14f160d-bf98-45da-a719-d8937e9281b0-kube-api-access-qlthq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6k57c\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.427944 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:35:33 crc kubenswrapper[4743]: I0310 15:35:33.996338 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c"] Mar 10 15:35:34 crc kubenswrapper[4743]: W0310 15:35:34.022908 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda14f160d_bf98_45da_a719_d8937e9281b0.slice/crio-9f10ca954a040c407e248894fe486f4e282e434075c11051113926bd0b73edca WatchSource:0}: Error finding container 9f10ca954a040c407e248894fe486f4e282e434075c11051113926bd0b73edca: Status 404 returned error can't find the container with id 9f10ca954a040c407e248894fe486f4e282e434075c11051113926bd0b73edca Mar 10 15:35:34 crc kubenswrapper[4743]: I0310 15:35:34.953005 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" event={"ID":"a14f160d-bf98-45da-a719-d8937e9281b0","Type":"ContainerStarted","Data":"9f10ca954a040c407e248894fe486f4e282e434075c11051113926bd0b73edca"} Mar 10 15:35:35 crc kubenswrapper[4743]: I0310 15:35:35.962028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" event={"ID":"a14f160d-bf98-45da-a719-d8937e9281b0","Type":"ContainerStarted","Data":"cc095ed95f535f37ee5c1b1d4a537e809efb83c38ea4051eb57736cf6becb59f"} Mar 10 15:35:35 crc kubenswrapper[4743]: I0310 15:35:35.981019 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" podStartSLOduration=2.282615738 podStartE2EDuration="2.980997784s" podCreationTimestamp="2026-03-10 15:35:33 +0000 UTC" firstStartedPulling="2026-03-10 15:35:34.02934478 +0000 UTC m=+1798.736159528" lastFinishedPulling="2026-03-10 15:35:34.727726786 +0000 UTC 
m=+1799.434541574" observedRunningTime="2026-03-10 15:35:35.979476661 +0000 UTC m=+1800.686291419" watchObservedRunningTime="2026-03-10 15:35:35.980997784 +0000 UTC m=+1800.687812532" Mar 10 15:35:38 crc kubenswrapper[4743]: I0310 15:35:38.915210 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:35:38 crc kubenswrapper[4743]: E0310 15:35:38.916145 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:35:50 crc kubenswrapper[4743]: I0310 15:35:50.916559 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:35:50 crc kubenswrapper[4743]: E0310 15:35:50.919333 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.161387 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552616-v5mqc"] Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.163804 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552616-v5mqc" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.166302 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.166633 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.166851 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.185793 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552616-v5mqc"] Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.306791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8frl\" (UniqueName: \"kubernetes.io/projected/1252848b-7af8-4167-aa0d-10efdc284e40-kube-api-access-f8frl\") pod \"auto-csr-approver-29552616-v5mqc\" (UID: \"1252848b-7af8-4167-aa0d-10efdc284e40\") " pod="openshift-infra/auto-csr-approver-29552616-v5mqc" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.409053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8frl\" (UniqueName: \"kubernetes.io/projected/1252848b-7af8-4167-aa0d-10efdc284e40-kube-api-access-f8frl\") pod \"auto-csr-approver-29552616-v5mqc\" (UID: \"1252848b-7af8-4167-aa0d-10efdc284e40\") " pod="openshift-infra/auto-csr-approver-29552616-v5mqc" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.426928 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8frl\" (UniqueName: \"kubernetes.io/projected/1252848b-7af8-4167-aa0d-10efdc284e40-kube-api-access-f8frl\") pod \"auto-csr-approver-29552616-v5mqc\" (UID: \"1252848b-7af8-4167-aa0d-10efdc284e40\") " 
pod="openshift-infra/auto-csr-approver-29552616-v5mqc" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.493962 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552616-v5mqc" Mar 10 15:36:00 crc kubenswrapper[4743]: I0310 15:36:00.950792 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552616-v5mqc"] Mar 10 15:36:01 crc kubenswrapper[4743]: I0310 15:36:01.231624 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552616-v5mqc" event={"ID":"1252848b-7af8-4167-aa0d-10efdc284e40","Type":"ContainerStarted","Data":"7ecd2f0eaecc1ddf3bbd1ac4a4c3a5abd7323456b2f0df7d5f3b080693e43b30"} Mar 10 15:36:02 crc kubenswrapper[4743]: I0310 15:36:02.049211 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8kq9l"] Mar 10 15:36:02 crc kubenswrapper[4743]: I0310 15:36:02.062050 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8kq9l"] Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.047210 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8746-account-create-update-v57t2"] Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.056702 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a335-account-create-update-bh55z"] Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.065914 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a335-account-create-update-bh55z"] Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.075026 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8746-account-create-update-v57t2"] Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.083237 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vvwgs"] Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.091098 4743 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vvwgs"] Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.257072 4743 generic.go:334] "Generic (PLEG): container finished" podID="1252848b-7af8-4167-aa0d-10efdc284e40" containerID="b91085497b22c7f17b848c338c16f1870cf5d1dbc17b27c8adc3e9781a2d57b3" exitCode=0 Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.257133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552616-v5mqc" event={"ID":"1252848b-7af8-4167-aa0d-10efdc284e40","Type":"ContainerDied","Data":"b91085497b22c7f17b848c338c16f1870cf5d1dbc17b27c8adc3e9781a2d57b3"} Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.916116 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:36:03 crc kubenswrapper[4743]: E0310 15:36:03.916511 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.930140 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806" path="/var/lib/kubelet/pods/3c7c4eb3-7d2a-4e4e-a515-e3177ac5c806/volumes" Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.931954 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc89d4c-f23b-4795-bfcd-96318a266339" path="/var/lib/kubelet/pods/7bc89d4c-f23b-4795-bfcd-96318a266339/volumes" Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.932931 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a74cb34e-ac59-473e-aa30-2c148a81e0ea" path="/var/lib/kubelet/pods/a74cb34e-ac59-473e-aa30-2c148a81e0ea/volumes" Mar 10 15:36:03 crc kubenswrapper[4743]: I0310 15:36:03.934450 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c53268-9723-45cc-b49b-068b245ea223" path="/var/lib/kubelet/pods/b8c53268-9723-45cc-b49b-068b245ea223/volumes" Mar 10 15:36:04 crc kubenswrapper[4743]: I0310 15:36:04.044958 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-bz2n7"] Mar 10 15:36:04 crc kubenswrapper[4743]: I0310 15:36:04.071973 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-bz2n7"] Mar 10 15:36:04 crc kubenswrapper[4743]: I0310 15:36:04.072048 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ada4-account-create-update-vgjvq"] Mar 10 15:36:04 crc kubenswrapper[4743]: I0310 15:36:04.072063 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ada4-account-create-update-vgjvq"] Mar 10 15:36:04 crc kubenswrapper[4743]: I0310 15:36:04.634204 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552616-v5mqc" Mar 10 15:36:04 crc kubenswrapper[4743]: I0310 15:36:04.744785 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8frl\" (UniqueName: \"kubernetes.io/projected/1252848b-7af8-4167-aa0d-10efdc284e40-kube-api-access-f8frl\") pod \"1252848b-7af8-4167-aa0d-10efdc284e40\" (UID: \"1252848b-7af8-4167-aa0d-10efdc284e40\") " Mar 10 15:36:04 crc kubenswrapper[4743]: I0310 15:36:04.749556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1252848b-7af8-4167-aa0d-10efdc284e40-kube-api-access-f8frl" (OuterVolumeSpecName: "kube-api-access-f8frl") pod "1252848b-7af8-4167-aa0d-10efdc284e40" (UID: "1252848b-7af8-4167-aa0d-10efdc284e40"). 
InnerVolumeSpecName "kube-api-access-f8frl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:36:04 crc kubenswrapper[4743]: I0310 15:36:04.847590 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8frl\" (UniqueName: \"kubernetes.io/projected/1252848b-7af8-4167-aa0d-10efdc284e40-kube-api-access-f8frl\") on node \"crc\" DevicePath \"\"" Mar 10 15:36:05 crc kubenswrapper[4743]: I0310 15:36:05.280568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552616-v5mqc" event={"ID":"1252848b-7af8-4167-aa0d-10efdc284e40","Type":"ContainerDied","Data":"7ecd2f0eaecc1ddf3bbd1ac4a4c3a5abd7323456b2f0df7d5f3b080693e43b30"} Mar 10 15:36:05 crc kubenswrapper[4743]: I0310 15:36:05.280613 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552616-v5mqc" Mar 10 15:36:05 crc kubenswrapper[4743]: I0310 15:36:05.280628 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ecd2f0eaecc1ddf3bbd1ac4a4c3a5abd7323456b2f0df7d5f3b080693e43b30" Mar 10 15:36:05 crc kubenswrapper[4743]: I0310 15:36:05.689076 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552610-k7w8j"] Mar 10 15:36:05 crc kubenswrapper[4743]: I0310 15:36:05.697256 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552610-k7w8j"] Mar 10 15:36:05 crc kubenswrapper[4743]: I0310 15:36:05.927133 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c8a28f7-44f1-4871-bff9-1d64242a7f5e" path="/var/lib/kubelet/pods/7c8a28f7-44f1-4871-bff9-1d64242a7f5e/volumes" Mar 10 15:36:05 crc kubenswrapper[4743]: I0310 15:36:05.928546 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f8c087-774e-49b4-b660-d3c4d9b72061" path="/var/lib/kubelet/pods/c5f8c087-774e-49b4-b660-d3c4d9b72061/volumes" Mar 10 15:36:05 crc 
kubenswrapper[4743]: I0310 15:36:05.929145 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cbf843-671d-45e8-8844-f1c0c3d9e099" path="/var/lib/kubelet/pods/e6cbf843-671d-45e8-8844-f1c0c3d9e099/volumes" Mar 10 15:36:18 crc kubenswrapper[4743]: I0310 15:36:18.915678 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:36:18 crc kubenswrapper[4743]: E0310 15:36:18.916465 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:36:21 crc kubenswrapper[4743]: I0310 15:36:21.046229 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9ltml"] Mar 10 15:36:21 crc kubenswrapper[4743]: I0310 15:36:21.066396 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9ltml"] Mar 10 15:36:21 crc kubenswrapper[4743]: I0310 15:36:21.933143 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d1127c-77f2-479e-8361-6d0736eee46b" path="/var/lib/kubelet/pods/f5d1127c-77f2-479e-8361-6d0736eee46b/volumes" Mar 10 15:36:28 crc kubenswrapper[4743]: I0310 15:36:28.031689 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zrnmr"] Mar 10 15:36:28 crc kubenswrapper[4743]: I0310 15:36:28.044675 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zrnmr"] Mar 10 15:36:29 crc kubenswrapper[4743]: I0310 15:36:29.930640 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496f6307-7603-4bfb-8524-86fd78005b43" 
path="/var/lib/kubelet/pods/496f6307-7603-4bfb-8524-86fd78005b43/volumes" Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.041247 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ng46f"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.051675 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8d7e-account-create-update-8j226"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.065029 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-n25pb"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.075415 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-n25pb"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.085510 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zm929"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.096952 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8d7e-account-create-update-8j226"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.104567 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ng46f"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.112473 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-2rq5t"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.120235 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zm929"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.128026 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-2rq5t"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.135382 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2567-account-create-update-vfqwd"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.143765 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-0647-account-create-update-wpz7z"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.152892 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2567-account-create-update-vfqwd"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.161355 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-0647-account-create-update-wpz7z"] Mar 10 15:36:32 crc kubenswrapper[4743]: I0310 15:36:32.915372 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:36:32 crc kubenswrapper[4743]: E0310 15:36:32.915660 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:36:33 crc kubenswrapper[4743]: I0310 15:36:33.931700 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1124ae-ce38-4379-b981-cb509dc25ce7" path="/var/lib/kubelet/pods/2f1124ae-ce38-4379-b981-cb509dc25ce7/volumes" Mar 10 15:36:33 crc kubenswrapper[4743]: I0310 15:36:33.932770 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43962c3a-61ca-4e7c-b56c-d9eefcaddeee" path="/var/lib/kubelet/pods/43962c3a-61ca-4e7c-b56c-d9eefcaddeee/volumes" Mar 10 15:36:33 crc kubenswrapper[4743]: I0310 15:36:33.933653 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5505a3d8-7971-4ab1-9d50-def772d62890" path="/var/lib/kubelet/pods/5505a3d8-7971-4ab1-9d50-def772d62890/volumes" Mar 10 15:36:33 crc kubenswrapper[4743]: I0310 15:36:33.934447 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="700303a1-61db-4669-9dba-31bba76aa8a5" path="/var/lib/kubelet/pods/700303a1-61db-4669-9dba-31bba76aa8a5/volumes" Mar 10 15:36:33 crc kubenswrapper[4743]: I0310 15:36:33.936183 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c6da81-8a78-4383-aa08-580c5242a582" path="/var/lib/kubelet/pods/82c6da81-8a78-4383-aa08-580c5242a582/volumes" Mar 10 15:36:33 crc kubenswrapper[4743]: I0310 15:36:33.936955 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6fe554d-63d8-4ba6-948b-d8658db57faa" path="/var/lib/kubelet/pods/a6fe554d-63d8-4ba6-948b-d8658db57faa/volumes" Mar 10 15:36:33 crc kubenswrapper[4743]: I0310 15:36:33.937737 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba22cc82-1acd-4b8b-b7a8-3d598262b490" path="/var/lib/kubelet/pods/ba22cc82-1acd-4b8b-b7a8-3d598262b490/volumes" Mar 10 15:36:35 crc kubenswrapper[4743]: I0310 15:36:35.056158 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1bfd-account-create-update-5nmxw"] Mar 10 15:36:35 crc kubenswrapper[4743]: I0310 15:36:35.069111 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1bfd-account-create-update-5nmxw"] Mar 10 15:36:35 crc kubenswrapper[4743]: I0310 15:36:35.940934 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b11c74-122b-4da8-a2e2-040fab849d4b" path="/var/lib/kubelet/pods/45b11c74-122b-4da8-a2e2-040fab849d4b/volumes" Mar 10 15:36:40 crc kubenswrapper[4743]: I0310 15:36:40.031866 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7mqd4"] Mar 10 15:36:40 crc kubenswrapper[4743]: I0310 15:36:40.042039 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7mqd4"] Mar 10 15:36:41 crc kubenswrapper[4743]: I0310 15:36:41.930143 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e25cc1-8829-47fa-8a68-9968a7ba8e75" 
path="/var/lib/kubelet/pods/49e25cc1-8829-47fa-8a68-9968a7ba8e75/volumes" Mar 10 15:36:46 crc kubenswrapper[4743]: I0310 15:36:46.916449 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:36:46 crc kubenswrapper[4743]: E0310 15:36:46.917711 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.267899 4743 scope.go:117] "RemoveContainer" containerID="a2d3367b0007bc5df8892fc90c19e0dc1af34864bb7630c2d95926468bb1659d" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.299473 4743 scope.go:117] "RemoveContainer" containerID="c0bc65b83909e55b7c6036d7051811bb7a0bc331c0a7957b851b97054df50db3" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.365241 4743 scope.go:117] "RemoveContainer" containerID="978071ec4fd79d6c2c9c0233a5371cf76049c7f9464452f18d9c6f3ffb2be1df" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.445388 4743 scope.go:117] "RemoveContainer" containerID="300da71cb91d0ac1a2302b90060f8f7de6cafc2774f32758184775b8fe3a4dd0" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.468686 4743 scope.go:117] "RemoveContainer" containerID="63e08faef298f1ca013fc2eb3b8539599ba6cd5f8a29056bf9a00ae5cb615025" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.507937 4743 scope.go:117] "RemoveContainer" containerID="84a396c96dc27e777c7f9314a1540f6da393210e011f6a9974e57b005fcaa3a5" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.558922 4743 scope.go:117] "RemoveContainer" containerID="305b046085f83c938e43b5e2730abb0fb27bea091b6845ae349b0f5afe67872a" 
Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.586589 4743 scope.go:117] "RemoveContainer" containerID="9dbe58b8ead8597d746676b37427f444f7b3bbfc54ff179db882a8a5efd31812" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.619839 4743 scope.go:117] "RemoveContainer" containerID="069cd101a1566d81e82a72fbc921a8de11938c3ea8594382fdfeac009bc71493" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.662133 4743 scope.go:117] "RemoveContainer" containerID="237cfa3f4796c9a423cf4553297b98768d36c0552b15995398fff3d8f1578a1f" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.696016 4743 scope.go:117] "RemoveContainer" containerID="8c8f0d92e486410a9d69d1c785aeafdd6070e8e2f2f256afd68a7e43cc265b33" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.733988 4743 scope.go:117] "RemoveContainer" containerID="0b6b14849e9c1267474dd716845908ea7eac67f43f201fe03212fd3307a38fed" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.796679 4743 scope.go:117] "RemoveContainer" containerID="455a255e318e3164ee52d227a1eb2481f1f6683044946c4e16ddfb8dfe57566e" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.872776 4743 scope.go:117] "RemoveContainer" containerID="cda9dd23595e75df463f0b79861831f21628bf5b35e2fa35e8042fb3d2251ac7" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.909074 4743 scope.go:117] "RemoveContainer" containerID="1ff7c6e5e6b8c6b0512b5575c96c5dd4d9388e3bb09a106006ce83f5f293c4bb" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.930342 4743 scope.go:117] "RemoveContainer" containerID="5693e71c8d0e859edaee1a6980c2dacd9bfea1a3fa8cd297dbf6d60811575a1c" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.955407 4743 scope.go:117] "RemoveContainer" containerID="ef22e1193204539d41cdc97b630e58390c221111525c11cb1d646bfcc7d278b1" Mar 10 15:36:53 crc kubenswrapper[4743]: I0310 15:36:53.979397 4743 scope.go:117] "RemoveContainer" containerID="a2db8459c8d99f4e1accffe0b6e2e05db562910b5566931f52012c1f77126af5" Mar 10 15:36:59 crc 
kubenswrapper[4743]: I0310 15:36:59.961418 4743 generic.go:334] "Generic (PLEG): container finished" podID="a14f160d-bf98-45da-a719-d8937e9281b0" containerID="cc095ed95f535f37ee5c1b1d4a537e809efb83c38ea4051eb57736cf6becb59f" exitCode=0 Mar 10 15:36:59 crc kubenswrapper[4743]: I0310 15:36:59.961540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" event={"ID":"a14f160d-bf98-45da-a719-d8937e9281b0","Type":"ContainerDied","Data":"cc095ed95f535f37ee5c1b1d4a537e809efb83c38ea4051eb57736cf6becb59f"} Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.490113 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.565751 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlthq\" (UniqueName: \"kubernetes.io/projected/a14f160d-bf98-45da-a719-d8937e9281b0-kube-api-access-qlthq\") pod \"a14f160d-bf98-45da-a719-d8937e9281b0\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.565937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-inventory\") pod \"a14f160d-bf98-45da-a719-d8937e9281b0\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.565975 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-ssh-key-openstack-edpm-ipam\") pod \"a14f160d-bf98-45da-a719-d8937e9281b0\" (UID: \"a14f160d-bf98-45da-a719-d8937e9281b0\") " Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.572200 4743 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/a14f160d-bf98-45da-a719-d8937e9281b0-kube-api-access-qlthq" (OuterVolumeSpecName: "kube-api-access-qlthq") pod "a14f160d-bf98-45da-a719-d8937e9281b0" (UID: "a14f160d-bf98-45da-a719-d8937e9281b0"). InnerVolumeSpecName "kube-api-access-qlthq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.595411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a14f160d-bf98-45da-a719-d8937e9281b0" (UID: "a14f160d-bf98-45da-a719-d8937e9281b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.614300 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-inventory" (OuterVolumeSpecName: "inventory") pod "a14f160d-bf98-45da-a719-d8937e9281b0" (UID: "a14f160d-bf98-45da-a719-d8937e9281b0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.669100 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlthq\" (UniqueName: \"kubernetes.io/projected/a14f160d-bf98-45da-a719-d8937e9281b0-kube-api-access-qlthq\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.669150 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.669162 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a14f160d-bf98-45da-a719-d8937e9281b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.915803 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:37:01 crc kubenswrapper[4743]: E0310 15:37:01.916237 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.984553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" event={"ID":"a14f160d-bf98-45da-a719-d8937e9281b0","Type":"ContainerDied","Data":"9f10ca954a040c407e248894fe486f4e282e434075c11051113926bd0b73edca"} Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.984597 4743 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="9f10ca954a040c407e248894fe486f4e282e434075c11051113926bd0b73edca" Mar 10 15:37:01 crc kubenswrapper[4743]: I0310 15:37:01.984653 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6k57c" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.087271 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw"] Mar 10 15:37:02 crc kubenswrapper[4743]: E0310 15:37:02.087741 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1252848b-7af8-4167-aa0d-10efdc284e40" containerName="oc" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.087759 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1252848b-7af8-4167-aa0d-10efdc284e40" containerName="oc" Mar 10 15:37:02 crc kubenswrapper[4743]: E0310 15:37:02.087789 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14f160d-bf98-45da-a719-d8937e9281b0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.087797 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14f160d-bf98-45da-a719-d8937e9281b0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.087998 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14f160d-bf98-45da-a719-d8937e9281b0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.088016 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1252848b-7af8-4167-aa0d-10efdc284e40" containerName="oc" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.088708 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.092325 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.092336 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.093029 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.093531 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.097238 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw"] Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.178981 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.179101 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc 
kubenswrapper[4743]: I0310 15:37:02.179384 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx7gm\" (UniqueName: \"kubernetes.io/projected/c1c83696-4c97-45ea-be7a-f635b349da0b-kube-api-access-gx7gm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.281020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx7gm\" (UniqueName: \"kubernetes.io/projected/c1c83696-4c97-45ea-be7a-f635b349da0b-kube-api-access-gx7gm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.281166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.281199 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.290710 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.290762 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.308538 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx7gm\" (UniqueName: \"kubernetes.io/projected/c1c83696-4c97-45ea-be7a-f635b349da0b-kube-api-access-gx7gm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:02 crc kubenswrapper[4743]: I0310 15:37:02.418542 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:37:03 crc kubenswrapper[4743]: I0310 15:37:03.007706 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw"] Mar 10 15:37:03 crc kubenswrapper[4743]: W0310 15:37:03.012946 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c83696_4c97_45ea_be7a_f635b349da0b.slice/crio-28c388d2c9c2a9bc47f8b056bff74a895d4f5c0555a7a8166e11540b46328182 WatchSource:0}: Error finding container 28c388d2c9c2a9bc47f8b056bff74a895d4f5c0555a7a8166e11540b46328182: Status 404 returned error can't find the container with id 28c388d2c9c2a9bc47f8b056bff74a895d4f5c0555a7a8166e11540b46328182 Mar 10 15:37:03 crc kubenswrapper[4743]: I0310 15:37:03.017359 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:37:04 crc kubenswrapper[4743]: I0310 15:37:04.066110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" event={"ID":"c1c83696-4c97-45ea-be7a-f635b349da0b","Type":"ContainerStarted","Data":"863c0a403f94c6b9398021a2a326899f4ee1a1887773aae06f1eab16e828e7dd"} Mar 10 15:37:04 crc kubenswrapper[4743]: I0310 15:37:04.066630 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" event={"ID":"c1c83696-4c97-45ea-be7a-f635b349da0b","Type":"ContainerStarted","Data":"28c388d2c9c2a9bc47f8b056bff74a895d4f5c0555a7a8166e11540b46328182"} Mar 10 15:37:04 crc kubenswrapper[4743]: I0310 15:37:04.105355 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" podStartSLOduration=1.689040926 podStartE2EDuration="2.10533391s" 
podCreationTimestamp="2026-03-10 15:37:02 +0000 UTC" firstStartedPulling="2026-03-10 15:37:03.01704681 +0000 UTC m=+1887.723861558" lastFinishedPulling="2026-03-10 15:37:03.433339794 +0000 UTC m=+1888.140154542" observedRunningTime="2026-03-10 15:37:04.097846477 +0000 UTC m=+1888.804661225" watchObservedRunningTime="2026-03-10 15:37:04.10533391 +0000 UTC m=+1888.812148658" Mar 10 15:37:13 crc kubenswrapper[4743]: I0310 15:37:13.057552 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4hbxs"] Mar 10 15:37:13 crc kubenswrapper[4743]: I0310 15:37:13.066388 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4hbxs"] Mar 10 15:37:13 crc kubenswrapper[4743]: I0310 15:37:13.926676 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a25471-c13d-434e-9f74-82de0cd19099" path="/var/lib/kubelet/pods/88a25471-c13d-434e-9f74-82de0cd19099/volumes" Mar 10 15:37:15 crc kubenswrapper[4743]: I0310 15:37:15.929604 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:37:15 crc kubenswrapper[4743]: E0310 15:37:15.930229 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:37:21 crc kubenswrapper[4743]: I0310 15:37:21.042630 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-lc42v"] Mar 10 15:37:21 crc kubenswrapper[4743]: I0310 15:37:21.055654 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-lc42v"] Mar 10 15:37:21 crc kubenswrapper[4743]: I0310 15:37:21.930894 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef9806a-40c1-468d-92d8-70e92819f27b" path="/var/lib/kubelet/pods/8ef9806a-40c1-468d-92d8-70e92819f27b/volumes" Mar 10 15:37:27 crc kubenswrapper[4743]: I0310 15:37:27.916054 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:37:27 crc kubenswrapper[4743]: E0310 15:37:27.917034 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:37:31 crc kubenswrapper[4743]: I0310 15:37:31.051400 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-z94q2"] Mar 10 15:37:31 crc kubenswrapper[4743]: I0310 15:37:31.066635 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8fnbh"] Mar 10 15:37:31 crc kubenswrapper[4743]: I0310 15:37:31.080616 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8fnbh"] Mar 10 15:37:31 crc kubenswrapper[4743]: I0310 15:37:31.090753 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-z94q2"] Mar 10 15:37:31 crc kubenswrapper[4743]: I0310 15:37:31.935144 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d812f1-305a-40e9-b8ff-51b5e640ca57" path="/var/lib/kubelet/pods/57d812f1-305a-40e9-b8ff-51b5e640ca57/volumes" Mar 10 15:37:31 crc kubenswrapper[4743]: I0310 15:37:31.935699 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ff1eff-8355-40a9-b02d-cfb47e08bb46" path="/var/lib/kubelet/pods/d7ff1eff-8355-40a9-b02d-cfb47e08bb46/volumes" Mar 10 
15:37:39 crc kubenswrapper[4743]: I0310 15:37:39.917101 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:37:39 crc kubenswrapper[4743]: E0310 15:37:39.918554 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.467860 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4wsjl"] Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.470343 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.483932 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4wsjl"] Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.569489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwhbq\" (UniqueName: \"kubernetes.io/projected/99c09abd-da9f-469b-b9e4-626fdaacb4f8-kube-api-access-fwhbq\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.569968 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-utilities\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " 
pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.570111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-catalog-content\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.672190 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwhbq\" (UniqueName: \"kubernetes.io/projected/99c09abd-da9f-469b-b9e4-626fdaacb4f8-kube-api-access-fwhbq\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.672345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-utilities\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.672379 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-catalog-content\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.672967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-catalog-content\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " 
pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.673420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-utilities\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.694687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwhbq\" (UniqueName: \"kubernetes.io/projected/99c09abd-da9f-469b-b9e4-626fdaacb4f8-kube-api-access-fwhbq\") pod \"certified-operators-4wsjl\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:40 crc kubenswrapper[4743]: I0310 15:37:40.803104 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:41 crc kubenswrapper[4743]: I0310 15:37:41.300826 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4wsjl"] Mar 10 15:37:41 crc kubenswrapper[4743]: I0310 15:37:41.477224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wsjl" event={"ID":"99c09abd-da9f-469b-b9e4-626fdaacb4f8","Type":"ContainerStarted","Data":"2294f93b1f0a47739774b4477b82b387478a76d1cf1ff08de07809eba7ef2738"} Mar 10 15:37:42 crc kubenswrapper[4743]: I0310 15:37:42.489447 4743 generic.go:334] "Generic (PLEG): container finished" podID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerID="1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee" exitCode=0 Mar 10 15:37:42 crc kubenswrapper[4743]: I0310 15:37:42.489712 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wsjl" 
event={"ID":"99c09abd-da9f-469b-b9e4-626fdaacb4f8","Type":"ContainerDied","Data":"1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee"} Mar 10 15:37:44 crc kubenswrapper[4743]: I0310 15:37:44.510051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wsjl" event={"ID":"99c09abd-da9f-469b-b9e4-626fdaacb4f8","Type":"ContainerStarted","Data":"3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb"} Mar 10 15:37:45 crc kubenswrapper[4743]: I0310 15:37:45.526927 4743 generic.go:334] "Generic (PLEG): container finished" podID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerID="3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb" exitCode=0 Mar 10 15:37:45 crc kubenswrapper[4743]: I0310 15:37:45.527005 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wsjl" event={"ID":"99c09abd-da9f-469b-b9e4-626fdaacb4f8","Type":"ContainerDied","Data":"3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb"} Mar 10 15:37:47 crc kubenswrapper[4743]: I0310 15:37:47.550477 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wsjl" event={"ID":"99c09abd-da9f-469b-b9e4-626fdaacb4f8","Type":"ContainerStarted","Data":"e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c"} Mar 10 15:37:47 crc kubenswrapper[4743]: I0310 15:37:47.583748 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4wsjl" podStartSLOduration=3.71728033 podStartE2EDuration="7.583724136s" podCreationTimestamp="2026-03-10 15:37:40 +0000 UTC" firstStartedPulling="2026-03-10 15:37:42.492029094 +0000 UTC m=+1927.198843842" lastFinishedPulling="2026-03-10 15:37:46.35847289 +0000 UTC m=+1931.065287648" observedRunningTime="2026-03-10 15:37:47.570305524 +0000 UTC m=+1932.277120282" watchObservedRunningTime="2026-03-10 15:37:47.583724136 +0000 UTC 
m=+1932.290538894" Mar 10 15:37:49 crc kubenswrapper[4743]: I0310 15:37:49.051714 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-86f6w"] Mar 10 15:37:49 crc kubenswrapper[4743]: I0310 15:37:49.067124 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-86f6w"] Mar 10 15:37:49 crc kubenswrapper[4743]: I0310 15:37:49.934670 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a99e3b-6d76-485c-b284-5f275ba9bbef" path="/var/lib/kubelet/pods/68a99e3b-6d76-485c-b284-5f275ba9bbef/volumes" Mar 10 15:37:50 crc kubenswrapper[4743]: I0310 15:37:50.803892 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:50 crc kubenswrapper[4743]: I0310 15:37:50.804918 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:50 crc kubenswrapper[4743]: I0310 15:37:50.891316 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:51 crc kubenswrapper[4743]: I0310 15:37:51.684987 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:51 crc kubenswrapper[4743]: I0310 15:37:51.915777 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:37:52 crc kubenswrapper[4743]: I0310 15:37:52.037699 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4wsjl"] Mar 10 15:37:52 crc kubenswrapper[4743]: I0310 15:37:52.630114 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"af106cec6bede362dcfa7f972c69b1f3686e1c5bf8c7d44260bf5cc40751d829"} Mar 10 15:37:53 crc kubenswrapper[4743]: I0310 15:37:53.642397 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4wsjl" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerName="registry-server" containerID="cri-o://e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c" gracePeriod=2 Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.045614 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-lgtjg"] Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.059984 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-lgtjg"] Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.164881 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.298484 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwhbq\" (UniqueName: \"kubernetes.io/projected/99c09abd-da9f-469b-b9e4-626fdaacb4f8-kube-api-access-fwhbq\") pod \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.298616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-catalog-content\") pod \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.298700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-utilities\") pod \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\" (UID: \"99c09abd-da9f-469b-b9e4-626fdaacb4f8\") " Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.300592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-utilities" (OuterVolumeSpecName: "utilities") pod "99c09abd-da9f-469b-b9e4-626fdaacb4f8" (UID: "99c09abd-da9f-469b-b9e4-626fdaacb4f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.305934 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c09abd-da9f-469b-b9e4-626fdaacb4f8-kube-api-access-fwhbq" (OuterVolumeSpecName: "kube-api-access-fwhbq") pod "99c09abd-da9f-469b-b9e4-626fdaacb4f8" (UID: "99c09abd-da9f-469b-b9e4-626fdaacb4f8"). InnerVolumeSpecName "kube-api-access-fwhbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.370004 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99c09abd-da9f-469b-b9e4-626fdaacb4f8" (UID: "99c09abd-da9f-469b-b9e4-626fdaacb4f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.384947 4743 scope.go:117] "RemoveContainer" containerID="d3dc28fde989f285e2d3ddd3cb9f48e1356b0d71457185b391360afe8cbe343f" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.417420 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.417487 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c09abd-da9f-469b-b9e4-626fdaacb4f8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.417503 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwhbq\" (UniqueName: \"kubernetes.io/projected/99c09abd-da9f-469b-b9e4-626fdaacb4f8-kube-api-access-fwhbq\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.446593 4743 scope.go:117] "RemoveContainer" containerID="fe10c62081917d54d95e30624b900758b372fc23fe1e0af34141218b0944e109" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.482634 4743 scope.go:117] "RemoveContainer" containerID="8ffbb79c1ba7553c23c09ecc5886e0f9abbe9773800fc7878bbf8fcde3f4c2de" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.526906 4743 scope.go:117] "RemoveContainer" containerID="38fa0812a7aade8e46c137973d40837bc5097536eaa054526cfa660d01b7eb39" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.562934 4743 scope.go:117] "RemoveContainer" containerID="75244dab8ce6de7ad77790db8fd0420d4fc9fa48f1073a66c7f739ab78cfa955" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.665302 4743 generic.go:334] "Generic (PLEG): container finished" podID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" 
containerID="e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c" exitCode=0 Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.665359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wsjl" event={"ID":"99c09abd-da9f-469b-b9e4-626fdaacb4f8","Type":"ContainerDied","Data":"e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c"} Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.665413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wsjl" event={"ID":"99c09abd-da9f-469b-b9e4-626fdaacb4f8","Type":"ContainerDied","Data":"2294f93b1f0a47739774b4477b82b387478a76d1cf1ff08de07809eba7ef2738"} Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.665415 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wsjl" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.665431 4743 scope.go:117] "RemoveContainer" containerID="e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.704885 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4wsjl"] Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.717319 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4wsjl"] Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.719660 4743 scope.go:117] "RemoveContainer" containerID="3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.746971 4743 scope.go:117] "RemoveContainer" containerID="1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.774263 4743 scope.go:117] "RemoveContainer" containerID="e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c" Mar 10 
15:37:54 crc kubenswrapper[4743]: E0310 15:37:54.774685 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c\": container with ID starting with e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c not found: ID does not exist" containerID="e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.774724 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c"} err="failed to get container status \"e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c\": rpc error: code = NotFound desc = could not find container \"e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c\": container with ID starting with e15265b0f890d10c7f73bff9457c0f877bd0bcc4874bb0edad61a4d52402dc4c not found: ID does not exist" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.774773 4743 scope.go:117] "RemoveContainer" containerID="3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb" Mar 10 15:37:54 crc kubenswrapper[4743]: E0310 15:37:54.775336 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb\": container with ID starting with 3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb not found: ID does not exist" containerID="3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.775444 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb"} err="failed to get container status 
\"3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb\": rpc error: code = NotFound desc = could not find container \"3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb\": container with ID starting with 3dd0010b45f166f3a3fea2116fd63c14c4fb1db71ba92f30094b333710ccb8cb not found: ID does not exist" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.775511 4743 scope.go:117] "RemoveContainer" containerID="1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee" Mar 10 15:37:54 crc kubenswrapper[4743]: E0310 15:37:54.776099 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee\": container with ID starting with 1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee not found: ID does not exist" containerID="1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee" Mar 10 15:37:54 crc kubenswrapper[4743]: I0310 15:37:54.776143 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee"} err="failed to get container status \"1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee\": rpc error: code = NotFound desc = could not find container \"1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee\": container with ID starting with 1728fadc033ab340bed374472aae63dbf4cbed845083ec215afcfba927ac30ee not found: ID does not exist" Mar 10 15:37:55 crc kubenswrapper[4743]: I0310 15:37:55.936364 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45477f7e-f216-40fb-acdb-d7a1dbadba99" path="/var/lib/kubelet/pods/45477f7e-f216-40fb-acdb-d7a1dbadba99/volumes" Mar 10 15:37:55 crc kubenswrapper[4743]: I0310 15:37:55.938377 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" 
path="/var/lib/kubelet/pods/99c09abd-da9f-469b-b9e4-626fdaacb4f8/volumes" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.178136 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552618-hrmvq"] Mar 10 15:38:00 crc kubenswrapper[4743]: E0310 15:38:00.179485 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerName="registry-server" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.179514 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerName="registry-server" Mar 10 15:38:00 crc kubenswrapper[4743]: E0310 15:38:00.179540 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerName="extract-content" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.179554 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerName="extract-content" Mar 10 15:38:00 crc kubenswrapper[4743]: E0310 15:38:00.179584 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerName="extract-utilities" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.179601 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerName="extract-utilities" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.180031 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c09abd-da9f-469b-b9e4-626fdaacb4f8" containerName="registry-server" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.181144 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552618-hrmvq" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.184069 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.184267 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.184799 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.192294 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552618-hrmvq"] Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.258138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlqlm\" (UniqueName: \"kubernetes.io/projected/85dfcae8-b723-4e7c-9686-25f8c2b71a4c-kube-api-access-zlqlm\") pod \"auto-csr-approver-29552618-hrmvq\" (UID: \"85dfcae8-b723-4e7c-9686-25f8c2b71a4c\") " pod="openshift-infra/auto-csr-approver-29552618-hrmvq" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.359991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlqlm\" (UniqueName: \"kubernetes.io/projected/85dfcae8-b723-4e7c-9686-25f8c2b71a4c-kube-api-access-zlqlm\") pod \"auto-csr-approver-29552618-hrmvq\" (UID: \"85dfcae8-b723-4e7c-9686-25f8c2b71a4c\") " pod="openshift-infra/auto-csr-approver-29552618-hrmvq" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.385687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlqlm\" (UniqueName: \"kubernetes.io/projected/85dfcae8-b723-4e7c-9686-25f8c2b71a4c-kube-api-access-zlqlm\") pod \"auto-csr-approver-29552618-hrmvq\" (UID: \"85dfcae8-b723-4e7c-9686-25f8c2b71a4c\") " 
pod="openshift-infra/auto-csr-approver-29552618-hrmvq" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.522045 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552618-hrmvq" Mar 10 15:38:00 crc kubenswrapper[4743]: I0310 15:38:00.973558 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552618-hrmvq"] Mar 10 15:38:01 crc kubenswrapper[4743]: I0310 15:38:01.747217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552618-hrmvq" event={"ID":"85dfcae8-b723-4e7c-9686-25f8c2b71a4c","Type":"ContainerStarted","Data":"be2ae1e6f1a0a60053e36ce45602b461198c0a4c0a1b88c5e46cf1059fb7e1ea"} Mar 10 15:38:02 crc kubenswrapper[4743]: I0310 15:38:02.757202 4743 generic.go:334] "Generic (PLEG): container finished" podID="85dfcae8-b723-4e7c-9686-25f8c2b71a4c" containerID="e4a836464e6c7c8129fe4a476121c268334b2a07a7c7893af23b9029b67787ef" exitCode=0 Mar 10 15:38:02 crc kubenswrapper[4743]: I0310 15:38:02.757314 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552618-hrmvq" event={"ID":"85dfcae8-b723-4e7c-9686-25f8c2b71a4c","Type":"ContainerDied","Data":"e4a836464e6c7c8129fe4a476121c268334b2a07a7c7893af23b9029b67787ef"} Mar 10 15:38:04 crc kubenswrapper[4743]: I0310 15:38:04.175981 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552618-hrmvq" Mar 10 15:38:04 crc kubenswrapper[4743]: I0310 15:38:04.246078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlqlm\" (UniqueName: \"kubernetes.io/projected/85dfcae8-b723-4e7c-9686-25f8c2b71a4c-kube-api-access-zlqlm\") pod \"85dfcae8-b723-4e7c-9686-25f8c2b71a4c\" (UID: \"85dfcae8-b723-4e7c-9686-25f8c2b71a4c\") " Mar 10 15:38:04 crc kubenswrapper[4743]: I0310 15:38:04.255039 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85dfcae8-b723-4e7c-9686-25f8c2b71a4c-kube-api-access-zlqlm" (OuterVolumeSpecName: "kube-api-access-zlqlm") pod "85dfcae8-b723-4e7c-9686-25f8c2b71a4c" (UID: "85dfcae8-b723-4e7c-9686-25f8c2b71a4c"). InnerVolumeSpecName "kube-api-access-zlqlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:38:04 crc kubenswrapper[4743]: I0310 15:38:04.348704 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlqlm\" (UniqueName: \"kubernetes.io/projected/85dfcae8-b723-4e7c-9686-25f8c2b71a4c-kube-api-access-zlqlm\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:04 crc kubenswrapper[4743]: I0310 15:38:04.785908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552618-hrmvq" event={"ID":"85dfcae8-b723-4e7c-9686-25f8c2b71a4c","Type":"ContainerDied","Data":"be2ae1e6f1a0a60053e36ce45602b461198c0a4c0a1b88c5e46cf1059fb7e1ea"} Mar 10 15:38:04 crc kubenswrapper[4743]: I0310 15:38:04.786005 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be2ae1e6f1a0a60053e36ce45602b461198c0a4c0a1b88c5e46cf1059fb7e1ea" Mar 10 15:38:04 crc kubenswrapper[4743]: I0310 15:38:04.786124 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552618-hrmvq" Mar 10 15:38:05 crc kubenswrapper[4743]: I0310 15:38:05.262732 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552612-lq6pb"] Mar 10 15:38:05 crc kubenswrapper[4743]: I0310 15:38:05.270796 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552612-lq6pb"] Mar 10 15:38:05 crc kubenswrapper[4743]: I0310 15:38:05.934587 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0eafa7-a122-440d-981f-f5b873fe6255" path="/var/lib/kubelet/pods/ed0eafa7-a122-440d-981f-f5b873fe6255/volumes" Mar 10 15:38:11 crc kubenswrapper[4743]: E0310 15:38:11.242492 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c83696_4c97_45ea_be7a_f635b349da0b.slice/crio-863c0a403f94c6b9398021a2a326899f4ee1a1887773aae06f1eab16e828e7dd.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:38:11 crc kubenswrapper[4743]: I0310 15:38:11.884398 4743 generic.go:334] "Generic (PLEG): container finished" podID="c1c83696-4c97-45ea-be7a-f635b349da0b" containerID="863c0a403f94c6b9398021a2a326899f4ee1a1887773aae06f1eab16e828e7dd" exitCode=0 Mar 10 15:38:11 crc kubenswrapper[4743]: I0310 15:38:11.884536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" event={"ID":"c1c83696-4c97-45ea-be7a-f635b349da0b","Type":"ContainerDied","Data":"863c0a403f94c6b9398021a2a326899f4ee1a1887773aae06f1eab16e828e7dd"} Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.380209 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.492092 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx7gm\" (UniqueName: \"kubernetes.io/projected/c1c83696-4c97-45ea-be7a-f635b349da0b-kube-api-access-gx7gm\") pod \"c1c83696-4c97-45ea-be7a-f635b349da0b\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.492212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-inventory\") pod \"c1c83696-4c97-45ea-be7a-f635b349da0b\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.492281 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-ssh-key-openstack-edpm-ipam\") pod \"c1c83696-4c97-45ea-be7a-f635b349da0b\" (UID: \"c1c83696-4c97-45ea-be7a-f635b349da0b\") " Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.504186 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c83696-4c97-45ea-be7a-f635b349da0b-kube-api-access-gx7gm" (OuterVolumeSpecName: "kube-api-access-gx7gm") pod "c1c83696-4c97-45ea-be7a-f635b349da0b" (UID: "c1c83696-4c97-45ea-be7a-f635b349da0b"). InnerVolumeSpecName "kube-api-access-gx7gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.520034 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c1c83696-4c97-45ea-be7a-f635b349da0b" (UID: "c1c83696-4c97-45ea-be7a-f635b349da0b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.523041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-inventory" (OuterVolumeSpecName: "inventory") pod "c1c83696-4c97-45ea-be7a-f635b349da0b" (UID: "c1c83696-4c97-45ea-be7a-f635b349da0b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.595011 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx7gm\" (UniqueName: \"kubernetes.io/projected/c1c83696-4c97-45ea-be7a-f635b349da0b-kube-api-access-gx7gm\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.595045 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.595055 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1c83696-4c97-45ea-be7a-f635b349da0b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.907771 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" 
event={"ID":"c1c83696-4c97-45ea-be7a-f635b349da0b","Type":"ContainerDied","Data":"28c388d2c9c2a9bc47f8b056bff74a895d4f5c0555a7a8166e11540b46328182"} Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.907843 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28c388d2c9c2a9bc47f8b056bff74a895d4f5c0555a7a8166e11540b46328182" Mar 10 15:38:13 crc kubenswrapper[4743]: I0310 15:38:13.907878 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.007622 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"] Mar 10 15:38:14 crc kubenswrapper[4743]: E0310 15:38:14.008446 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c83696-4c97-45ea-be7a-f635b349da0b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.008463 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c83696-4c97-45ea-be7a-f635b349da0b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:14 crc kubenswrapper[4743]: E0310 15:38:14.008511 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85dfcae8-b723-4e7c-9686-25f8c2b71a4c" containerName="oc" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.008519 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="85dfcae8-b723-4e7c-9686-25f8c2b71a4c" containerName="oc" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.008760 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="85dfcae8-b723-4e7c-9686-25f8c2b71a4c" containerName="oc" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.008778 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c83696-4c97-45ea-be7a-f635b349da0b" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.009576 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.013672 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.013910 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.014095 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.014536 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.021896 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"] Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.109329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf" Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.109379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d979\" (UniqueName: \"kubernetes.io/projected/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-kube-api-access-4d979\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.109428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.211519 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.211611 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d979\" (UniqueName: \"kubernetes.io/projected/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-kube-api-access-4d979\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.211648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.216548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.216837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.230443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d979\" (UniqueName: \"kubernetes.io/projected/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-kube-api-access-4d979\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.338550 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.748195 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"]
Mar 10 15:38:14 crc kubenswrapper[4743]: W0310 15:38:14.752056 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d2cd554_fedb_43fb_aefd_0f82a6c265e4.slice/crio-4b91082bbf60fda7d20577f0aadd0a580351c4cf2e843ea4b3988a3333498ff0 WatchSource:0}: Error finding container 4b91082bbf60fda7d20577f0aadd0a580351c4cf2e843ea4b3988a3333498ff0: Status 404 returned error can't find the container with id 4b91082bbf60fda7d20577f0aadd0a580351c4cf2e843ea4b3988a3333498ff0
Mar 10 15:38:14 crc kubenswrapper[4743]: I0310 15:38:14.920540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf" event={"ID":"7d2cd554-fedb-43fb-aefd-0f82a6c265e4","Type":"ContainerStarted","Data":"4b91082bbf60fda7d20577f0aadd0a580351c4cf2e843ea4b3988a3333498ff0"}
Mar 10 15:38:15 crc kubenswrapper[4743]: I0310 15:38:15.943435 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf" event={"ID":"7d2cd554-fedb-43fb-aefd-0f82a6c265e4","Type":"ContainerStarted","Data":"a8b0b970369002d6eec5a8898536fc0859a7968ceb620c0f19ceed4bcee8cf4a"}
Mar 10 15:38:15 crc kubenswrapper[4743]: I0310 15:38:15.987429 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf" podStartSLOduration=2.246050934 podStartE2EDuration="2.987403435s" podCreationTimestamp="2026-03-10 15:38:13 +0000 UTC" firstStartedPulling="2026-03-10 15:38:14.755226042 +0000 UTC m=+1959.462040810" lastFinishedPulling="2026-03-10 15:38:15.496578563 +0000 UTC m=+1960.203393311" observedRunningTime="2026-03-10 15:38:15.975160917 +0000 UTC m=+1960.681975675" watchObservedRunningTime="2026-03-10 15:38:15.987403435 +0000 UTC m=+1960.694218203"
Mar 10 15:38:21 crc kubenswrapper[4743]: I0310 15:38:21.003206 4743 generic.go:334] "Generic (PLEG): container finished" podID="7d2cd554-fedb-43fb-aefd-0f82a6c265e4" containerID="a8b0b970369002d6eec5a8898536fc0859a7968ceb620c0f19ceed4bcee8cf4a" exitCode=0
Mar 10 15:38:21 crc kubenswrapper[4743]: I0310 15:38:21.003298 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf" event={"ID":"7d2cd554-fedb-43fb-aefd-0f82a6c265e4","Type":"ContainerDied","Data":"a8b0b970369002d6eec5a8898536fc0859a7968ceb620c0f19ceed4bcee8cf4a"}
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.468138 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.610239 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d979\" (UniqueName: \"kubernetes.io/projected/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-kube-api-access-4d979\") pod \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") "
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.610325 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-ssh-key-openstack-edpm-ipam\") pod \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") "
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.610432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-inventory\") pod \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\" (UID: \"7d2cd554-fedb-43fb-aefd-0f82a6c265e4\") "
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.620082 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-kube-api-access-4d979" (OuterVolumeSpecName: "kube-api-access-4d979") pod "7d2cd554-fedb-43fb-aefd-0f82a6c265e4" (UID: "7d2cd554-fedb-43fb-aefd-0f82a6c265e4"). InnerVolumeSpecName "kube-api-access-4d979". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.636836 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-inventory" (OuterVolumeSpecName: "inventory") pod "7d2cd554-fedb-43fb-aefd-0f82a6c265e4" (UID: "7d2cd554-fedb-43fb-aefd-0f82a6c265e4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.647638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7d2cd554-fedb-43fb-aefd-0f82a6c265e4" (UID: "7d2cd554-fedb-43fb-aefd-0f82a6c265e4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.713921 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.713957 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d979\" (UniqueName: \"kubernetes.io/projected/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-kube-api-access-4d979\") on node \"crc\" DevicePath \"\""
Mar 10 15:38:22 crc kubenswrapper[4743]: I0310 15:38:22.713971 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d2cd554-fedb-43fb-aefd-0f82a6c265e4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.028673 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf" event={"ID":"7d2cd554-fedb-43fb-aefd-0f82a6c265e4","Type":"ContainerDied","Data":"4b91082bbf60fda7d20577f0aadd0a580351c4cf2e843ea4b3988a3333498ff0"}
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.028731 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b91082bbf60fda7d20577f0aadd0a580351c4cf2e843ea4b3988a3333498ff0"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.028784 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.132707 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"]
Mar 10 15:38:23 crc kubenswrapper[4743]: E0310 15:38:23.133350 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2cd554-fedb-43fb-aefd-0f82a6c265e4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.133372 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2cd554-fedb-43fb-aefd-0f82a6c265e4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.133600 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2cd554-fedb-43fb-aefd-0f82a6c265e4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.134230 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.137527 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.137529 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.137762 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.137853 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.161423 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"]
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.227546 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97pp5\" (UniqueName: \"kubernetes.io/projected/5ac39670-4aeb-410f-a275-9b011cb8a21c-kube-api-access-97pp5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.227903 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.228062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.329981 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97pp5\" (UniqueName: \"kubernetes.io/projected/5ac39670-4aeb-410f-a275-9b011cb8a21c-kube-api-access-97pp5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.330090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.330185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.347508 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.347491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.351580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97pp5\" (UniqueName: \"kubernetes.io/projected/5ac39670-4aeb-410f-a275-9b011cb8a21c-kube-api-access-97pp5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qnj4p\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:23 crc kubenswrapper[4743]: I0310 15:38:23.476441 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:38:24 crc kubenswrapper[4743]: I0310 15:38:24.047864 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"]
Mar 10 15:38:25 crc kubenswrapper[4743]: I0310 15:38:25.064484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p" event={"ID":"5ac39670-4aeb-410f-a275-9b011cb8a21c","Type":"ContainerStarted","Data":"8875f3125a737bbdeafb18d3230ad4e5fef7148383f046266f96459fd5a6154e"}
Mar 10 15:38:25 crc kubenswrapper[4743]: I0310 15:38:25.065260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p" event={"ID":"5ac39670-4aeb-410f-a275-9b011cb8a21c","Type":"ContainerStarted","Data":"ebd944fed8cf659919e564afa8fd775d17ea74eafc747dd2e8ce4cb518be3e3d"}
Mar 10 15:38:25 crc kubenswrapper[4743]: I0310 15:38:25.094134 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p" podStartSLOduration=1.634392732 podStartE2EDuration="2.094111461s" podCreationTimestamp="2026-03-10 15:38:23 +0000 UTC" firstStartedPulling="2026-03-10 15:38:24.060222107 +0000 UTC m=+1968.767036865" lastFinishedPulling="2026-03-10 15:38:24.519940846 +0000 UTC m=+1969.226755594" observedRunningTime="2026-03-10 15:38:25.091879117 +0000 UTC m=+1969.798693915" watchObservedRunningTime="2026-03-10 15:38:25.094111461 +0000 UTC m=+1969.800926239"
Mar 10 15:38:54 crc kubenswrapper[4743]: I0310 15:38:54.044417 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nxpnp"]
Mar 10 15:38:54 crc kubenswrapper[4743]: I0310 15:38:54.056303 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nxpnp"]
Mar 10 15:38:54 crc kubenswrapper[4743]: I0310 15:38:54.729358 4743 scope.go:117] "RemoveContainer" containerID="71168cb29e252d52842812232b59f3ab91750eedb5242abe78a9583409ffdbb5"
Mar 10 15:38:54 crc kubenswrapper[4743]: I0310 15:38:54.774710 4743 scope.go:117] "RemoveContainer" containerID="7469fa0f036b039a82ebd805a4b2cf98465a614c7a0da55c947089bbf1fc3e9b"
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.040282 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-be97-account-create-update-9xhqq"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.054867 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-95c8-account-create-update-8vlkw"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.065292 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-wkklm"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.075349 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-392d-account-create-update-cvlg4"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.084619 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-95c8-account-create-update-8vlkw"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.092489 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-wkklm"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.108829 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-392d-account-create-update-cvlg4"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.115673 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-be97-account-create-update-9xhqq"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.125422 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-lv5qs"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.133258 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-lv5qs"]
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.940088 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0532548b-3818-4f8f-bf72-c7ba529b4a8c" path="/var/lib/kubelet/pods/0532548b-3818-4f8f-bf72-c7ba529b4a8c/volumes"
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.941698 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354ad36f-b0fb-4520-b5ed-b9ca3a65579c" path="/var/lib/kubelet/pods/354ad36f-b0fb-4520-b5ed-b9ca3a65579c/volumes"
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.943160 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c02071d-0a01-4d7d-bbd5-727544dd7fbe" path="/var/lib/kubelet/pods/8c02071d-0a01-4d7d-bbd5-727544dd7fbe/volumes"
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.945152 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab66fedc-1151-429b-9301-14e20af0cc44" path="/var/lib/kubelet/pods/ab66fedc-1151-429b-9301-14e20af0cc44/volumes"
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.947349 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b692e051-bc25-4039-ac2d-10d17a93a44a" path="/var/lib/kubelet/pods/b692e051-bc25-4039-ac2d-10d17a93a44a/volumes"
Mar 10 15:38:55 crc kubenswrapper[4743]: I0310 15:38:55.948576 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecaa5958-eab2-4af0-b0ba-43d0c91517f1" path="/var/lib/kubelet/pods/ecaa5958-eab2-4af0-b0ba-43d0c91517f1/volumes"
Mar 10 15:39:01 crc kubenswrapper[4743]: I0310 15:39:01.448448 4743 generic.go:334] "Generic (PLEG): container finished" podID="5ac39670-4aeb-410f-a275-9b011cb8a21c" containerID="8875f3125a737bbdeafb18d3230ad4e5fef7148383f046266f96459fd5a6154e" exitCode=0
Mar 10 15:39:01 crc kubenswrapper[4743]: I0310 15:39:01.448552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p" event={"ID":"5ac39670-4aeb-410f-a275-9b011cb8a21c","Type":"ContainerDied","Data":"8875f3125a737bbdeafb18d3230ad4e5fef7148383f046266f96459fd5a6154e"}
Mar 10 15:39:02 crc kubenswrapper[4743]: I0310 15:39:02.892352 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.078076 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97pp5\" (UniqueName: \"kubernetes.io/projected/5ac39670-4aeb-410f-a275-9b011cb8a21c-kube-api-access-97pp5\") pod \"5ac39670-4aeb-410f-a275-9b011cb8a21c\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") "
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.078195 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-inventory\") pod \"5ac39670-4aeb-410f-a275-9b011cb8a21c\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") "
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.078297 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-ssh-key-openstack-edpm-ipam\") pod \"5ac39670-4aeb-410f-a275-9b011cb8a21c\" (UID: \"5ac39670-4aeb-410f-a275-9b011cb8a21c\") "
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.084063 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac39670-4aeb-410f-a275-9b011cb8a21c-kube-api-access-97pp5" (OuterVolumeSpecName: "kube-api-access-97pp5") pod "5ac39670-4aeb-410f-a275-9b011cb8a21c" (UID: "5ac39670-4aeb-410f-a275-9b011cb8a21c"). InnerVolumeSpecName "kube-api-access-97pp5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.131968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-inventory" (OuterVolumeSpecName: "inventory") pod "5ac39670-4aeb-410f-a275-9b011cb8a21c" (UID: "5ac39670-4aeb-410f-a275-9b011cb8a21c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.132475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5ac39670-4aeb-410f-a275-9b011cb8a21c" (UID: "5ac39670-4aeb-410f-a275-9b011cb8a21c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.181135 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97pp5\" (UniqueName: \"kubernetes.io/projected/5ac39670-4aeb-410f-a275-9b011cb8a21c-kube-api-access-97pp5\") on node \"crc\" DevicePath \"\""
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.181178 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.181190 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ac39670-4aeb-410f-a275-9b011cb8a21c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.475576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p" event={"ID":"5ac39670-4aeb-410f-a275-9b011cb8a21c","Type":"ContainerDied","Data":"ebd944fed8cf659919e564afa8fd775d17ea74eafc747dd2e8ce4cb518be3e3d"}
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.475899 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebd944fed8cf659919e564afa8fd775d17ea74eafc747dd2e8ce4cb518be3e3d"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.476018 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qnj4p"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.554599 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"]
Mar 10 15:39:03 crc kubenswrapper[4743]: E0310 15:39:03.555162 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac39670-4aeb-410f-a275-9b011cb8a21c" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.555191 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac39670-4aeb-410f-a275-9b011cb8a21c" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.555435 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac39670-4aeb-410f-a275-9b011cb8a21c" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.556176 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.558663 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.558695 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.558670 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.558845 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.568031 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"]
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.592941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.593062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qlv\" (UniqueName: \"kubernetes.io/projected/273ce723-179f-468a-b890-3336be7763a0-kube-api-access-l7qlv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.593216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.696232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.696364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qlv\" (UniqueName: \"kubernetes.io/projected/273ce723-179f-468a-b890-3336be7763a0-kube-api-access-l7qlv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.696447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.700595 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.701757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.728954 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qlv\" (UniqueName: \"kubernetes.io/projected/273ce723-179f-468a-b890-3336be7763a0-kube-api-access-l7qlv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l22ct\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:03 crc kubenswrapper[4743]: I0310 15:39:03.878614 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:04 crc kubenswrapper[4743]: I0310 15:39:04.410758 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"]
Mar 10 15:39:04 crc kubenswrapper[4743]: I0310 15:39:04.485305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct" event={"ID":"273ce723-179f-468a-b890-3336be7763a0","Type":"ContainerStarted","Data":"634f0db4052057c9ba64a1966e81a6bc3ba03d551c53906d81eede04790f5c13"}
Mar 10 15:39:05 crc kubenswrapper[4743]: I0310 15:39:05.499168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct" event={"ID":"273ce723-179f-468a-b890-3336be7763a0","Type":"ContainerStarted","Data":"373718a05c07b9f0f562dba005273e50c7dbcf2a657706900a76b13a9eed2063"}
Mar 10 15:39:05 crc kubenswrapper[4743]: I0310 15:39:05.516787 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct" podStartSLOduration=2.032649652 podStartE2EDuration="2.516763965s" podCreationTimestamp="2026-03-10 15:39:03 +0000 UTC" firstStartedPulling="2026-03-10 15:39:04.415114154 +0000 UTC m=+2009.121928902" lastFinishedPulling="2026-03-10 15:39:04.899228467 +0000 UTC m=+2009.606043215" observedRunningTime="2026-03-10 15:39:05.514900022 +0000 UTC m=+2010.221714780" watchObservedRunningTime="2026-03-10 15:39:05.516763965 +0000 UTC m=+2010.223578733"
Mar 10 15:39:34 crc kubenswrapper[4743]: I0310 15:39:34.050090 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chr79"]
Mar 10 15:39:34 crc kubenswrapper[4743]: I0310 15:39:34.061661 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chr79"]
Mar 10 15:39:35 crc kubenswrapper[4743]: I0310 15:39:35.932974 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed9e7ad-ba13-436c-bf27-46d65eb60af3" path="/var/lib/kubelet/pods/aed9e7ad-ba13-436c-bf27-46d65eb60af3/volumes"
Mar 10 15:39:53 crc kubenswrapper[4743]: I0310 15:39:53.065494 4743 generic.go:334] "Generic (PLEG): container finished" podID="273ce723-179f-468a-b890-3336be7763a0" containerID="373718a05c07b9f0f562dba005273e50c7dbcf2a657706900a76b13a9eed2063" exitCode=0
Mar 10 15:39:53 crc kubenswrapper[4743]: I0310 15:39:53.065590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct" event={"ID":"273ce723-179f-468a-b890-3336be7763a0","Type":"ContainerDied","Data":"373718a05c07b9f0f562dba005273e50c7dbcf2a657706900a76b13a9eed2063"}
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.555791 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.667063 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-inventory\") pod \"273ce723-179f-468a-b890-3336be7763a0\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") "
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.667290 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qlv\" (UniqueName: \"kubernetes.io/projected/273ce723-179f-468a-b890-3336be7763a0-kube-api-access-l7qlv\") pod \"273ce723-179f-468a-b890-3336be7763a0\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") "
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.667365 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-ssh-key-openstack-edpm-ipam\") pod \"273ce723-179f-468a-b890-3336be7763a0\" (UID: \"273ce723-179f-468a-b890-3336be7763a0\") "
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.676040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273ce723-179f-468a-b890-3336be7763a0-kube-api-access-l7qlv" (OuterVolumeSpecName: "kube-api-access-l7qlv") pod "273ce723-179f-468a-b890-3336be7763a0" (UID: "273ce723-179f-468a-b890-3336be7763a0"). InnerVolumeSpecName "kube-api-access-l7qlv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.701382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "273ce723-179f-468a-b890-3336be7763a0" (UID: "273ce723-179f-468a-b890-3336be7763a0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.723924 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-inventory" (OuterVolumeSpecName: "inventory") pod "273ce723-179f-468a-b890-3336be7763a0" (UID: "273ce723-179f-468a-b890-3336be7763a0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.771975 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qlv\" (UniqueName: \"kubernetes.io/projected/273ce723-179f-468a-b890-3336be7763a0-kube-api-access-l7qlv\") on node \"crc\" DevicePath \"\""
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.772035 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.772055 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/273ce723-179f-468a-b890-3336be7763a0-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.903363 4743 scope.go:117] "RemoveContainer" containerID="a8dd7b460bd2e7414390c8d7085a0dcb6b84aac52fe5cdeda52c284f014e06be"
Mar 10 15:39:54 crc kubenswrapper[4743]: I0310 15:39:54.970996 4743 scope.go:117] "RemoveContainer" containerID="35f28da742cacba26ec83c9cc64c97f16d333200d4989f59cb8e895c24055d85"
Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.005139 4743 scope.go:117] "RemoveContainer" containerID="5eb0c295f33cd6777c1ef34bb445783dff19e6f5f544635d824bf84d494ce516"
Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.031501 4743 scope.go:117] "RemoveContainer" containerID="84baa1f756512af822219736b6626e3fc403936fa6127e5fc358eaa54b095f9b"
Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.062949 4743 scope.go:117] "RemoveContainer" containerID="f03eb86dbeab0d1d4f8221d962b019a810019848adf138a66b41c7b77c8f0d9a"
Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.090880 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct"
event={"ID":"273ce723-179f-468a-b890-3336be7763a0","Type":"ContainerDied","Data":"634f0db4052057c9ba64a1966e81a6bc3ba03d551c53906d81eede04790f5c13"} Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.090926 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="634f0db4052057c9ba64a1966e81a6bc3ba03d551c53906d81eede04790f5c13" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.091011 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l22ct" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.104683 4743 scope.go:117] "RemoveContainer" containerID="4f82bd6b7be4ae3036d2af4517825cf4f4e1689e3a61947d39477f32bcdea00c" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.146522 4743 scope.go:117] "RemoveContainer" containerID="ff49b55f2a74a9839399581c6e40028716b275ce41497fdcd5d42c8d093f713f" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.216990 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-twkdg"] Mar 10 15:39:55 crc kubenswrapper[4743]: E0310 15:39:55.218141 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273ce723-179f-468a-b890-3336be7763a0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.218170 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="273ce723-179f-468a-b890-3336be7763a0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.218544 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="273ce723-179f-468a-b890-3336be7763a0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.219355 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.224162 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.224347 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.224420 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.225003 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.230327 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-twkdg"] Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.283726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74v7t\" (UniqueName: \"kubernetes.io/projected/8c7fcae9-f7ce-458c-ba25-87d2e932de62-kube-api-access-74v7t\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.283842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.283925 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.386112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74v7t\" (UniqueName: \"kubernetes.io/projected/8c7fcae9-f7ce-458c-ba25-87d2e932de62-kube-api-access-74v7t\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.386167 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.386208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.391211 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc 
kubenswrapper[4743]: I0310 15:39:55.391386 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.402422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74v7t\" (UniqueName: \"kubernetes.io/projected/8c7fcae9-f7ce-458c-ba25-87d2e932de62-kube-api-access-74v7t\") pod \"ssh-known-hosts-edpm-deployment-twkdg\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:55 crc kubenswrapper[4743]: I0310 15:39:55.534029 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:39:56 crc kubenswrapper[4743]: I0310 15:39:56.232277 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-twkdg"] Mar 10 15:39:57 crc kubenswrapper[4743]: I0310 15:39:57.126068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" event={"ID":"8c7fcae9-f7ce-458c-ba25-87d2e932de62","Type":"ContainerStarted","Data":"a79a2a5413f6825adc6dbdafbaa594bf7321b75cea272807a777a1fbbe48934d"} Mar 10 15:39:57 crc kubenswrapper[4743]: I0310 15:39:57.126299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" event={"ID":"8c7fcae9-f7ce-458c-ba25-87d2e932de62","Type":"ContainerStarted","Data":"bcb5e29297c31461a939a8de13ad0d673c5275a8cfd4d7ded20aed00987b2718"} Mar 10 15:39:57 crc kubenswrapper[4743]: I0310 15:39:57.157901 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" podStartSLOduration=1.647189099 podStartE2EDuration="2.157877748s" podCreationTimestamp="2026-03-10 15:39:55 +0000 UTC" firstStartedPulling="2026-03-10 15:39:56.24265472 +0000 UTC m=+2060.949469468" lastFinishedPulling="2026-03-10 15:39:56.753343329 +0000 UTC m=+2061.460158117" observedRunningTime="2026-03-10 15:39:57.145881266 +0000 UTC m=+2061.852696054" watchObservedRunningTime="2026-03-10 15:39:57.157877748 +0000 UTC m=+2061.864692506" Mar 10 15:39:58 crc kubenswrapper[4743]: I0310 15:39:58.059374 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4rs44"] Mar 10 15:39:58 crc kubenswrapper[4743]: I0310 15:39:58.073696 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4rs44"] Mar 10 15:39:59 crc kubenswrapper[4743]: I0310 15:39:59.037960 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-l2xjq"] Mar 10 15:39:59 crc kubenswrapper[4743]: I0310 15:39:59.050050 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-l2xjq"] Mar 10 15:39:59 crc kubenswrapper[4743]: I0310 15:39:59.934636 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f18369-c26d-4389-a49c-e0178c8d6db3" path="/var/lib/kubelet/pods/67f18369-c26d-4389-a49c-e0178c8d6db3/volumes" Mar 10 15:39:59 crc kubenswrapper[4743]: I0310 15:39:59.935898 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b077b1c-05fd-4ef3-8adb-42ff870116c9" path="/var/lib/kubelet/pods/8b077b1c-05fd-4ef3-8adb-42ff870116c9/volumes" Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.156595 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552620-gncgx"] Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.159299 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552620-gncgx" Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.163222 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.163958 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.164138 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.169278 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552620-gncgx"] Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.193658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4w57\" (UniqueName: \"kubernetes.io/projected/acab6558-b67f-47a2-a721-59b9a7206996-kube-api-access-v4w57\") pod \"auto-csr-approver-29552620-gncgx\" (UID: \"acab6558-b67f-47a2-a721-59b9a7206996\") " pod="openshift-infra/auto-csr-approver-29552620-gncgx" Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.296155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4w57\" (UniqueName: \"kubernetes.io/projected/acab6558-b67f-47a2-a721-59b9a7206996-kube-api-access-v4w57\") pod \"auto-csr-approver-29552620-gncgx\" (UID: \"acab6558-b67f-47a2-a721-59b9a7206996\") " pod="openshift-infra/auto-csr-approver-29552620-gncgx" Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.319505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4w57\" (UniqueName: \"kubernetes.io/projected/acab6558-b67f-47a2-a721-59b9a7206996-kube-api-access-v4w57\") pod \"auto-csr-approver-29552620-gncgx\" (UID: \"acab6558-b67f-47a2-a721-59b9a7206996\") " 
pod="openshift-infra/auto-csr-approver-29552620-gncgx" Mar 10 15:40:00 crc kubenswrapper[4743]: I0310 15:40:00.487096 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552620-gncgx" Mar 10 15:40:01 crc kubenswrapper[4743]: I0310 15:40:01.009793 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552620-gncgx"] Mar 10 15:40:01 crc kubenswrapper[4743]: I0310 15:40:01.175679 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552620-gncgx" event={"ID":"acab6558-b67f-47a2-a721-59b9a7206996","Type":"ContainerStarted","Data":"823bace9b21c417d8dc11add31538b15ba2d36f0add567c0ddaffa424e431a6c"} Mar 10 15:40:03 crc kubenswrapper[4743]: I0310 15:40:03.200448 4743 generic.go:334] "Generic (PLEG): container finished" podID="acab6558-b67f-47a2-a721-59b9a7206996" containerID="46573d2ad2a41fe1b1d66c8c4902a281fa145c0c728caa297991294a1291a76d" exitCode=0 Mar 10 15:40:03 crc kubenswrapper[4743]: I0310 15:40:03.201241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552620-gncgx" event={"ID":"acab6558-b67f-47a2-a721-59b9a7206996","Type":"ContainerDied","Data":"46573d2ad2a41fe1b1d66c8c4902a281fa145c0c728caa297991294a1291a76d"} Mar 10 15:40:04 crc kubenswrapper[4743]: I0310 15:40:04.216670 4743 generic.go:334] "Generic (PLEG): container finished" podID="8c7fcae9-f7ce-458c-ba25-87d2e932de62" containerID="a79a2a5413f6825adc6dbdafbaa594bf7321b75cea272807a777a1fbbe48934d" exitCode=0 Mar 10 15:40:04 crc kubenswrapper[4743]: I0310 15:40:04.217169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" event={"ID":"8c7fcae9-f7ce-458c-ba25-87d2e932de62","Type":"ContainerDied","Data":"a79a2a5413f6825adc6dbdafbaa594bf7321b75cea272807a777a1fbbe48934d"} Mar 10 15:40:04 crc kubenswrapper[4743]: I0310 15:40:04.575074 4743 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552620-gncgx" Mar 10 15:40:04 crc kubenswrapper[4743]: I0310 15:40:04.703668 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4w57\" (UniqueName: \"kubernetes.io/projected/acab6558-b67f-47a2-a721-59b9a7206996-kube-api-access-v4w57\") pod \"acab6558-b67f-47a2-a721-59b9a7206996\" (UID: \"acab6558-b67f-47a2-a721-59b9a7206996\") " Mar 10 15:40:04 crc kubenswrapper[4743]: I0310 15:40:04.716159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acab6558-b67f-47a2-a721-59b9a7206996-kube-api-access-v4w57" (OuterVolumeSpecName: "kube-api-access-v4w57") pod "acab6558-b67f-47a2-a721-59b9a7206996" (UID: "acab6558-b67f-47a2-a721-59b9a7206996"). InnerVolumeSpecName "kube-api-access-v4w57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:40:04 crc kubenswrapper[4743]: I0310 15:40:04.806439 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4w57\" (UniqueName: \"kubernetes.io/projected/acab6558-b67f-47a2-a721-59b9a7206996-kube-api-access-v4w57\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.227378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552620-gncgx" event={"ID":"acab6558-b67f-47a2-a721-59b9a7206996","Type":"ContainerDied","Data":"823bace9b21c417d8dc11add31538b15ba2d36f0add567c0ddaffa424e431a6c"} Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.227411 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552620-gncgx" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.227429 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823bace9b21c417d8dc11add31538b15ba2d36f0add567c0ddaffa424e431a6c" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.664804 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552614-q49mz"] Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.678298 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552614-q49mz"] Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.713789 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.830482 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-inventory-0\") pod \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.830547 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74v7t\" (UniqueName: \"kubernetes.io/projected/8c7fcae9-f7ce-458c-ba25-87d2e932de62-kube-api-access-74v7t\") pod \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.830779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-ssh-key-openstack-edpm-ipam\") pod \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\" (UID: \"8c7fcae9-f7ce-458c-ba25-87d2e932de62\") " Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.834365 
4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7fcae9-f7ce-458c-ba25-87d2e932de62-kube-api-access-74v7t" (OuterVolumeSpecName: "kube-api-access-74v7t") pod "8c7fcae9-f7ce-458c-ba25-87d2e932de62" (UID: "8c7fcae9-f7ce-458c-ba25-87d2e932de62"). InnerVolumeSpecName "kube-api-access-74v7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.858376 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c7fcae9-f7ce-458c-ba25-87d2e932de62" (UID: "8c7fcae9-f7ce-458c-ba25-87d2e932de62"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.872908 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8c7fcae9-f7ce-458c-ba25-87d2e932de62" (UID: "8c7fcae9-f7ce-458c-ba25-87d2e932de62"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.930200 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="519df03f-1942-4901-b698-7f2d2703704b" path="/var/lib/kubelet/pods/519df03f-1942-4901-b698-7f2d2703704b/volumes" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.933484 4743 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.933527 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74v7t\" (UniqueName: \"kubernetes.io/projected/8c7fcae9-f7ce-458c-ba25-87d2e932de62-kube-api-access-74v7t\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:05 crc kubenswrapper[4743]: I0310 15:40:05.933550 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c7fcae9-f7ce-458c-ba25-87d2e932de62-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.240064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" event={"ID":"8c7fcae9-f7ce-458c-ba25-87d2e932de62","Type":"ContainerDied","Data":"bcb5e29297c31461a939a8de13ad0d673c5275a8cfd4d7ded20aed00987b2718"} Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.240117 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcb5e29297c31461a939a8de13ad0d673c5275a8cfd4d7ded20aed00987b2718" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.240128 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-twkdg" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.327239 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p"] Mar 10 15:40:06 crc kubenswrapper[4743]: E0310 15:40:06.327927 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7fcae9-f7ce-458c-ba25-87d2e932de62" containerName="ssh-known-hosts-edpm-deployment" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.327943 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7fcae9-f7ce-458c-ba25-87d2e932de62" containerName="ssh-known-hosts-edpm-deployment" Mar 10 15:40:06 crc kubenswrapper[4743]: E0310 15:40:06.327964 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acab6558-b67f-47a2-a721-59b9a7206996" containerName="oc" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.327970 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="acab6558-b67f-47a2-a721-59b9a7206996" containerName="oc" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.328203 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="acab6558-b67f-47a2-a721-59b9a7206996" containerName="oc" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.328222 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7fcae9-f7ce-458c-ba25-87d2e932de62" containerName="ssh-known-hosts-edpm-deployment" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.328887 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.331089 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.332607 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.332777 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.332871 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.337851 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p"] Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.449028 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.449116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.449238 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/ca697d29-a195-499a-8e64-1688d6748d0c-kube-api-access-8mwwb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.550910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.550994 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.551031 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/ca697d29-a195-499a-8e64-1688d6748d0c-kube-api-access-8mwwb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.556055 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: 
\"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.556375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.569975 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/ca697d29-a195-499a-8e64-1688d6748d0c-kube-api-access-8mwwb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bgq9p\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:06 crc kubenswrapper[4743]: I0310 15:40:06.647494 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:07 crc kubenswrapper[4743]: I0310 15:40:07.255649 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p"] Mar 10 15:40:07 crc kubenswrapper[4743]: W0310 15:40:07.266424 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca697d29_a195_499a_8e64_1688d6748d0c.slice/crio-fef8bf6f9f122abed469d1a63c26b6885a4aef07efcb2e9b1c5bfbda80287496 WatchSource:0}: Error finding container fef8bf6f9f122abed469d1a63c26b6885a4aef07efcb2e9b1c5bfbda80287496: Status 404 returned error can't find the container with id fef8bf6f9f122abed469d1a63c26b6885a4aef07efcb2e9b1c5bfbda80287496 Mar 10 15:40:08 crc kubenswrapper[4743]: I0310 15:40:08.270392 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" event={"ID":"ca697d29-a195-499a-8e64-1688d6748d0c","Type":"ContainerStarted","Data":"c4201bd5462b20fa2081c6744dd51ed9d9239ccebf97f57641ab75c41be76ebd"} Mar 10 15:40:08 crc kubenswrapper[4743]: I0310 15:40:08.270800 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" event={"ID":"ca697d29-a195-499a-8e64-1688d6748d0c","Type":"ContainerStarted","Data":"fef8bf6f9f122abed469d1a63c26b6885a4aef07efcb2e9b1c5bfbda80287496"} Mar 10 15:40:08 crc kubenswrapper[4743]: I0310 15:40:08.308175 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" podStartSLOduration=1.883851259 podStartE2EDuration="2.3081413s" podCreationTimestamp="2026-03-10 15:40:06 +0000 UTC" firstStartedPulling="2026-03-10 15:40:07.269773969 +0000 UTC m=+2071.976588717" lastFinishedPulling="2026-03-10 15:40:07.69406397 +0000 UTC m=+2072.400878758" observedRunningTime="2026-03-10 
15:40:08.29091388 +0000 UTC m=+2072.997728638" watchObservedRunningTime="2026-03-10 15:40:08.3081413 +0000 UTC m=+2073.014956088" Mar 10 15:40:11 crc kubenswrapper[4743]: I0310 15:40:11.252713 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:40:11 crc kubenswrapper[4743]: I0310 15:40:11.253176 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:40:16 crc kubenswrapper[4743]: I0310 15:40:16.353788 4743 generic.go:334] "Generic (PLEG): container finished" podID="ca697d29-a195-499a-8e64-1688d6748d0c" containerID="c4201bd5462b20fa2081c6744dd51ed9d9239ccebf97f57641ab75c41be76ebd" exitCode=0 Mar 10 15:40:16 crc kubenswrapper[4743]: I0310 15:40:16.353882 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" event={"ID":"ca697d29-a195-499a-8e64-1688d6748d0c","Type":"ContainerDied","Data":"c4201bd5462b20fa2081c6744dd51ed9d9239ccebf97f57641ab75c41be76ebd"} Mar 10 15:40:17 crc kubenswrapper[4743]: I0310 15:40:17.821503 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:17 crc kubenswrapper[4743]: I0310 15:40:17.909621 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-inventory\") pod \"ca697d29-a195-499a-8e64-1688d6748d0c\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " Mar 10 15:40:17 crc kubenswrapper[4743]: I0310 15:40:17.909927 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-ssh-key-openstack-edpm-ipam\") pod \"ca697d29-a195-499a-8e64-1688d6748d0c\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " Mar 10 15:40:17 crc kubenswrapper[4743]: I0310 15:40:17.910427 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/ca697d29-a195-499a-8e64-1688d6748d0c-kube-api-access-8mwwb\") pod \"ca697d29-a195-499a-8e64-1688d6748d0c\" (UID: \"ca697d29-a195-499a-8e64-1688d6748d0c\") " Mar 10 15:40:17 crc kubenswrapper[4743]: I0310 15:40:17.914850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca697d29-a195-499a-8e64-1688d6748d0c-kube-api-access-8mwwb" (OuterVolumeSpecName: "kube-api-access-8mwwb") pod "ca697d29-a195-499a-8e64-1688d6748d0c" (UID: "ca697d29-a195-499a-8e64-1688d6748d0c"). InnerVolumeSpecName "kube-api-access-8mwwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:40:17 crc kubenswrapper[4743]: I0310 15:40:17.944018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-inventory" (OuterVolumeSpecName: "inventory") pod "ca697d29-a195-499a-8e64-1688d6748d0c" (UID: "ca697d29-a195-499a-8e64-1688d6748d0c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:17 crc kubenswrapper[4743]: I0310 15:40:17.951226 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca697d29-a195-499a-8e64-1688d6748d0c" (UID: "ca697d29-a195-499a-8e64-1688d6748d0c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.012906 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.012937 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mwwb\" (UniqueName: \"kubernetes.io/projected/ca697d29-a195-499a-8e64-1688d6748d0c-kube-api-access-8mwwb\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.012948 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca697d29-a195-499a-8e64-1688d6748d0c-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.383406 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" event={"ID":"ca697d29-a195-499a-8e64-1688d6748d0c","Type":"ContainerDied","Data":"fef8bf6f9f122abed469d1a63c26b6885a4aef07efcb2e9b1c5bfbda80287496"} Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.383490 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fef8bf6f9f122abed469d1a63c26b6885a4aef07efcb2e9b1c5bfbda80287496" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 
15:40:18.383509 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bgq9p" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.506660 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n"] Mar 10 15:40:18 crc kubenswrapper[4743]: E0310 15:40:18.507317 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca697d29-a195-499a-8e64-1688d6748d0c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.507392 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca697d29-a195-499a-8e64-1688d6748d0c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.507648 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca697d29-a195-499a-8e64-1688d6748d0c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.508329 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.510590 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.511339 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.511489 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.511588 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.520083 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n"] Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.625401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.625489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.625644 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6nl\" (UniqueName: \"kubernetes.io/projected/85abb06e-1162-44ac-93bf-9db7fb08a980-kube-api-access-9l6nl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.727546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6nl\" (UniqueName: \"kubernetes.io/projected/85abb06e-1162-44ac-93bf-9db7fb08a980-kube-api-access-9l6nl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.727708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.727757 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.733649 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.734779 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.745065 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6nl\" (UniqueName: \"kubernetes.io/projected/85abb06e-1162-44ac-93bf-9db7fb08a980-kube-api-access-9l6nl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:18 crc kubenswrapper[4743]: I0310 15:40:18.825083 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:19 crc kubenswrapper[4743]: I0310 15:40:19.375538 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n"] Mar 10 15:40:19 crc kubenswrapper[4743]: W0310 15:40:19.382763 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85abb06e_1162_44ac_93bf_9db7fb08a980.slice/crio-edcd201cad1a293b6b678310f32bca02ec220b7d12f530e5ab008f54ea3d5c49 WatchSource:0}: Error finding container edcd201cad1a293b6b678310f32bca02ec220b7d12f530e5ab008f54ea3d5c49: Status 404 returned error can't find the container with id edcd201cad1a293b6b678310f32bca02ec220b7d12f530e5ab008f54ea3d5c49 Mar 10 15:40:19 crc kubenswrapper[4743]: I0310 15:40:19.393405 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" event={"ID":"85abb06e-1162-44ac-93bf-9db7fb08a980","Type":"ContainerStarted","Data":"edcd201cad1a293b6b678310f32bca02ec220b7d12f530e5ab008f54ea3d5c49"} Mar 10 15:40:20 crc kubenswrapper[4743]: I0310 15:40:20.405806 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" event={"ID":"85abb06e-1162-44ac-93bf-9db7fb08a980","Type":"ContainerStarted","Data":"a928b49cc1e0f37d308b3df7c032675b0793587d095b5163ce6e9458d6f4e5de"} Mar 10 15:40:20 crc kubenswrapper[4743]: I0310 15:40:20.433480 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" podStartSLOduration=1.916807162 podStartE2EDuration="2.433456641s" podCreationTimestamp="2026-03-10 15:40:18 +0000 UTC" firstStartedPulling="2026-03-10 15:40:19.387227886 +0000 UTC m=+2084.094042634" lastFinishedPulling="2026-03-10 15:40:19.903877375 +0000 UTC m=+2084.610692113" 
observedRunningTime="2026-03-10 15:40:20.424391203 +0000 UTC m=+2085.131205941" watchObservedRunningTime="2026-03-10 15:40:20.433456641 +0000 UTC m=+2085.140271389" Mar 10 15:40:29 crc kubenswrapper[4743]: I0310 15:40:29.495279 4743 generic.go:334] "Generic (PLEG): container finished" podID="85abb06e-1162-44ac-93bf-9db7fb08a980" containerID="a928b49cc1e0f37d308b3df7c032675b0793587d095b5163ce6e9458d6f4e5de" exitCode=0 Mar 10 15:40:29 crc kubenswrapper[4743]: I0310 15:40:29.495798 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" event={"ID":"85abb06e-1162-44ac-93bf-9db7fb08a980","Type":"ContainerDied","Data":"a928b49cc1e0f37d308b3df7c032675b0793587d095b5163ce6e9458d6f4e5de"} Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.001869 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.109761 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-inventory\") pod \"85abb06e-1162-44ac-93bf-9db7fb08a980\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.109873 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l6nl\" (UniqueName: \"kubernetes.io/projected/85abb06e-1162-44ac-93bf-9db7fb08a980-kube-api-access-9l6nl\") pod \"85abb06e-1162-44ac-93bf-9db7fb08a980\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.109901 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-ssh-key-openstack-edpm-ipam\") pod 
\"85abb06e-1162-44ac-93bf-9db7fb08a980\" (UID: \"85abb06e-1162-44ac-93bf-9db7fb08a980\") " Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.119968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85abb06e-1162-44ac-93bf-9db7fb08a980-kube-api-access-9l6nl" (OuterVolumeSpecName: "kube-api-access-9l6nl") pod "85abb06e-1162-44ac-93bf-9db7fb08a980" (UID: "85abb06e-1162-44ac-93bf-9db7fb08a980"). InnerVolumeSpecName "kube-api-access-9l6nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.137237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "85abb06e-1162-44ac-93bf-9db7fb08a980" (UID: "85abb06e-1162-44ac-93bf-9db7fb08a980"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.139793 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-inventory" (OuterVolumeSpecName: "inventory") pod "85abb06e-1162-44ac-93bf-9db7fb08a980" (UID: "85abb06e-1162-44ac-93bf-9db7fb08a980"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.213191 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.213345 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l6nl\" (UniqueName: \"kubernetes.io/projected/85abb06e-1162-44ac-93bf-9db7fb08a980-kube-api-access-9l6nl\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.213428 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85abb06e-1162-44ac-93bf-9db7fb08a980-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.515952 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" event={"ID":"85abb06e-1162-44ac-93bf-9db7fb08a980","Type":"ContainerDied","Data":"edcd201cad1a293b6b678310f32bca02ec220b7d12f530e5ab008f54ea3d5c49"} Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.516252 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edcd201cad1a293b6b678310f32bca02ec220b7d12f530e5ab008f54ea3d5c49" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.516141 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.593208 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk"] Mar 10 15:40:31 crc kubenswrapper[4743]: E0310 15:40:31.593700 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85abb06e-1162-44ac-93bf-9db7fb08a980" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.593727 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="85abb06e-1162-44ac-93bf-9db7fb08a980" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.594006 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="85abb06e-1162-44ac-93bf-9db7fb08a980" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.594841 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.596662 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.597225 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.597990 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.598230 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.598402 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.598554 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.598925 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.599121 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.606050 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk"] Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722001 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722054 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722073 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722124 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722168 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722366 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf76z\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-kube-api-access-sf76z\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722600 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.722631 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.824767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.824857 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.824905 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.824941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.824980 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: 
\"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf76z\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-kube-api-access-sf76z\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.825384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.830024 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.830091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.830185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.830185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.832848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.834521 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.835636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.839503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.840195 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.841794 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.843550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.843727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.845843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.847371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf76z\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-kube-api-access-sf76z\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:31 crc kubenswrapper[4743]: I0310 15:40:31.920673 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:40:32 crc kubenswrapper[4743]: W0310 15:40:32.494660 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a1df74b_db86_4291_9bbc_202200fb7f7b.slice/crio-72b94e0c8afc2f5e19a394476eaa5f07bb9d0777801d22ed1d343814b74160ec WatchSource:0}: Error finding container 72b94e0c8afc2f5e19a394476eaa5f07bb9d0777801d22ed1d343814b74160ec: Status 404 returned error can't find the container with id 72b94e0c8afc2f5e19a394476eaa5f07bb9d0777801d22ed1d343814b74160ec Mar 10 15:40:32 crc kubenswrapper[4743]: I0310 15:40:32.498636 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk"] Mar 10 15:40:32 crc kubenswrapper[4743]: I0310 15:40:32.530776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" event={"ID":"2a1df74b-db86-4291-9bbc-202200fb7f7b","Type":"ContainerStarted","Data":"72b94e0c8afc2f5e19a394476eaa5f07bb9d0777801d22ed1d343814b74160ec"} Mar 10 15:40:33 crc kubenswrapper[4743]: I0310 15:40:33.541714 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" event={"ID":"2a1df74b-db86-4291-9bbc-202200fb7f7b","Type":"ContainerStarted","Data":"7dd09e100f07a773a87a922b7e084d7e2751a39e8d0607b5967abc07eb14d7f8"} Mar 10 15:40:33 crc kubenswrapper[4743]: I0310 15:40:33.572656 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" podStartSLOduration=2.120971743 podStartE2EDuration="2.572631963s" podCreationTimestamp="2026-03-10 15:40:31 +0000 UTC" firstStartedPulling="2026-03-10 15:40:32.498287279 +0000 UTC m=+2097.205102057" lastFinishedPulling="2026-03-10 15:40:32.949947529 +0000 UTC m=+2097.656762277" observedRunningTime="2026-03-10 15:40:33.566690064 +0000 UTC m=+2098.273504812" watchObservedRunningTime="2026-03-10 15:40:33.572631963 +0000 UTC m=+2098.279446711" Mar 10 15:40:41 crc kubenswrapper[4743]: I0310 15:40:41.252199 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:40:41 crc kubenswrapper[4743]: I0310 15:40:41.252841 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:40:43 crc kubenswrapper[4743]: I0310 15:40:43.057698 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gf5kp"] Mar 10 15:40:43 crc kubenswrapper[4743]: I0310 15:40:43.072914 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gf5kp"] Mar 10 15:40:43 crc kubenswrapper[4743]: I0310 15:40:43.932003 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3" path="/var/lib/kubelet/pods/b9af2a86-f428-4dd2-b4c2-a8bc9dc02be3/volumes" Mar 10 15:40:55 crc kubenswrapper[4743]: I0310 15:40:55.310583 4743 scope.go:117] "RemoveContainer" 
containerID="ac1e4cfb488b460578546001cf462b5ef2cf0dfd231f446088c2b77d097a0340" Mar 10 15:40:55 crc kubenswrapper[4743]: I0310 15:40:55.371137 4743 scope.go:117] "RemoveContainer" containerID="87c551d40a377cf99116a7b432d9abad33977a05937958fecd64640cf7c067e2" Mar 10 15:40:55 crc kubenswrapper[4743]: I0310 15:40:55.436329 4743 scope.go:117] "RemoveContainer" containerID="3437e368284b6c57ea8906b6b04141fd77c8e95cea732ef59b968b2ee59909ac" Mar 10 15:40:55 crc kubenswrapper[4743]: I0310 15:40:55.496983 4743 scope.go:117] "RemoveContainer" containerID="7f23faabcfffc57c70fc703b1ec09963ac7b1d751a3a2e37f2f723d3c983a24e" Mar 10 15:41:09 crc kubenswrapper[4743]: I0310 15:41:09.975546 4743 generic.go:334] "Generic (PLEG): container finished" podID="2a1df74b-db86-4291-9bbc-202200fb7f7b" containerID="7dd09e100f07a773a87a922b7e084d7e2751a39e8d0607b5967abc07eb14d7f8" exitCode=0 Mar 10 15:41:09 crc kubenswrapper[4743]: I0310 15:41:09.975652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" event={"ID":"2a1df74b-db86-4291-9bbc-202200fb7f7b","Type":"ContainerDied","Data":"7dd09e100f07a773a87a922b7e084d7e2751a39e8d0607b5967abc07eb14d7f8"} Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.252567 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.252665 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:41:11 crc kubenswrapper[4743]: 
I0310 15:41:11.252737 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.254272 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af106cec6bede362dcfa7f972c69b1f3686e1c5bf8c7d44260bf5cc40751d829"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.254443 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://af106cec6bede362dcfa7f972c69b1f3686e1c5bf8c7d44260bf5cc40751d829" gracePeriod=600 Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.529638 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.610570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-neutron-metadata-combined-ca-bundle\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611159 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ovn-combined-ca-bundle\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611197 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-bootstrap-combined-ca-bundle\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611486 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ssh-key-openstack-edpm-ipam\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: 
\"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-nova-combined-ca-bundle\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611732 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-inventory\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611776 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.611837 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-repo-setup-combined-ca-bundle\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc 
kubenswrapper[4743]: I0310 15:41:11.611980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.612041 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-libvirt-combined-ca-bundle\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.612143 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf76z\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-kube-api-access-sf76z\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.612186 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-telemetry-combined-ca-bundle\") pod \"2a1df74b-db86-4291-9bbc-202200fb7f7b\" (UID: \"2a1df74b-db86-4291-9bbc-202200fb7f7b\") " Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.619638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.620260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.619557 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.622454 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.622554 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.622664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.626949 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-kube-api-access-sf76z" (OuterVolumeSpecName: "kube-api-access-sf76z") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "kube-api-access-sf76z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.627395 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.632409 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.637852 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.634679 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.638762 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.665592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-inventory" (OuterVolumeSpecName: "inventory") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.673202 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2a1df74b-db86-4291-9bbc-202200fb7f7b" (UID: "2a1df74b-db86-4291-9bbc-202200fb7f7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714732 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714788 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714804 4743 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714836 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714853 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-inventory\") on node \"crc\" DevicePath 
\"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714867 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714881 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714895 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714909 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714922 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf76z\" (UniqueName: \"kubernetes.io/projected/2a1df74b-db86-4291-9bbc-202200fb7f7b-kube-api-access-sf76z\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714936 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714948 4743 reconciler_common.go:293] "Volume detached for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714960 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:11 crc kubenswrapper[4743]: I0310 15:41:11.714972 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1df74b-db86-4291-9bbc-202200fb7f7b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:11.999724 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:11.999691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk" event={"ID":"2a1df74b-db86-4291-9bbc-202200fb7f7b","Type":"ContainerDied","Data":"72b94e0c8afc2f5e19a394476eaa5f07bb9d0777801d22ed1d343814b74160ec"} Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.000449 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72b94e0c8afc2f5e19a394476eaa5f07bb9d0777801d22ed1d343814b74160ec" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.006411 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="af106cec6bede362dcfa7f972c69b1f3686e1c5bf8c7d44260bf5cc40751d829" exitCode=0 Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.006722 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"af106cec6bede362dcfa7f972c69b1f3686e1c5bf8c7d44260bf5cc40751d829"} Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.006911 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"} Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.007037 4743 scope.go:117] "RemoveContainer" containerID="1495ebfcf9c77a88a5ff6969124ce145ae8addb2f0a4e726361059eec8da042d" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.146287 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5"] Mar 10 15:41:12 crc kubenswrapper[4743]: E0310 15:41:12.146798 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1df74b-db86-4291-9bbc-202200fb7f7b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.146887 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1df74b-db86-4291-9bbc-202200fb7f7b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.147118 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1df74b-db86-4291-9bbc-202200fb7f7b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.148687 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.151201 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.151852 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.153638 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.161095 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.161092 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.166623 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5"] Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.223374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.223501 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.223586 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.223946 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.223993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9lv\" (UniqueName: \"kubernetes.io/projected/b481f000-abd5-4ee6-8b39-301e66c22f2a-kube-api-access-bh9lv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.325992 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.327018 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bh9lv\" (UniqueName: \"kubernetes.io/projected/b481f000-abd5-4ee6-8b39-301e66c22f2a-kube-api-access-bh9lv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.327176 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.327257 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.327320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.328329 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.332119 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.333103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.336642 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.346376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9lv\" (UniqueName: \"kubernetes.io/projected/b481f000-abd5-4ee6-8b39-301e66c22f2a-kube-api-access-bh9lv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pmgm5\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:12 crc kubenswrapper[4743]: I0310 15:41:12.478447 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:41:13 crc kubenswrapper[4743]: I0310 15:41:13.063197 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5"] Mar 10 15:41:13 crc kubenswrapper[4743]: W0310 15:41:13.065416 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb481f000_abd5_4ee6_8b39_301e66c22f2a.slice/crio-f5c382d7800f6271bb264796dc2d43916c077c92262a38c060b661836dbb55fc WatchSource:0}: Error finding container f5c382d7800f6271bb264796dc2d43916c077c92262a38c060b661836dbb55fc: Status 404 returned error can't find the container with id f5c382d7800f6271bb264796dc2d43916c077c92262a38c060b661836dbb55fc Mar 10 15:41:14 crc kubenswrapper[4743]: I0310 15:41:14.037238 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" event={"ID":"b481f000-abd5-4ee6-8b39-301e66c22f2a","Type":"ContainerStarted","Data":"3bd2374c8866c1ca7f1c011551ddb3b2d998d70a6f19f7c6960d9c0a08be5cb1"} Mar 10 15:41:14 crc kubenswrapper[4743]: I0310 15:41:14.038081 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" event={"ID":"b481f000-abd5-4ee6-8b39-301e66c22f2a","Type":"ContainerStarted","Data":"f5c382d7800f6271bb264796dc2d43916c077c92262a38c060b661836dbb55fc"} Mar 10 15:41:14 crc kubenswrapper[4743]: I0310 15:41:14.070545 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" podStartSLOduration=1.654728778 podStartE2EDuration="2.070524317s" podCreationTimestamp="2026-03-10 15:41:12 +0000 UTC" firstStartedPulling="2026-03-10 15:41:13.069469098 +0000 UTC m=+2137.776283856" lastFinishedPulling="2026-03-10 15:41:13.485264647 +0000 UTC m=+2138.192079395" observedRunningTime="2026-03-10 
15:41:14.062412737 +0000 UTC m=+2138.769227475" watchObservedRunningTime="2026-03-10 15:41:14.070524317 +0000 UTC m=+2138.777339075" Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.168209 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552622-26d76"] Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.171659 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552622-26d76" Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.177129 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.177775 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.178016 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.186240 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552622-26d76"] Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.251996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6wz5\" (UniqueName: \"kubernetes.io/projected/0ca96e86-ba03-4721-a048-44952f8dd42c-kube-api-access-x6wz5\") pod \"auto-csr-approver-29552622-26d76\" (UID: \"0ca96e86-ba03-4721-a048-44952f8dd42c\") " pod="openshift-infra/auto-csr-approver-29552622-26d76" Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.354678 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6wz5\" (UniqueName: \"kubernetes.io/projected/0ca96e86-ba03-4721-a048-44952f8dd42c-kube-api-access-x6wz5\") pod \"auto-csr-approver-29552622-26d76\" (UID: \"0ca96e86-ba03-4721-a048-44952f8dd42c\") " 
pod="openshift-infra/auto-csr-approver-29552622-26d76" Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.388114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6wz5\" (UniqueName: \"kubernetes.io/projected/0ca96e86-ba03-4721-a048-44952f8dd42c-kube-api-access-x6wz5\") pod \"auto-csr-approver-29552622-26d76\" (UID: \"0ca96e86-ba03-4721-a048-44952f8dd42c\") " pod="openshift-infra/auto-csr-approver-29552622-26d76" Mar 10 15:42:00 crc kubenswrapper[4743]: I0310 15:42:00.498930 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552622-26d76" Mar 10 15:42:01 crc kubenswrapper[4743]: I0310 15:42:01.005241 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552622-26d76"] Mar 10 15:42:01 crc kubenswrapper[4743]: I0310 15:42:01.522295 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552622-26d76" event={"ID":"0ca96e86-ba03-4721-a048-44952f8dd42c","Type":"ContainerStarted","Data":"6b7c04c99df52aa33badbc2e9389a3fac89f35590f44141ea66b8d99f30b8f9b"} Mar 10 15:42:03 crc kubenswrapper[4743]: I0310 15:42:03.540568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552622-26d76" event={"ID":"0ca96e86-ba03-4721-a048-44952f8dd42c","Type":"ContainerStarted","Data":"3438dd95ee26942bb61121df3ae0b6bed2c5b5648de8e37eeb9e8ea494a09e6f"} Mar 10 15:42:04 crc kubenswrapper[4743]: I0310 15:42:04.555830 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ca96e86-ba03-4721-a048-44952f8dd42c" containerID="3438dd95ee26942bb61121df3ae0b6bed2c5b5648de8e37eeb9e8ea494a09e6f" exitCode=0 Mar 10 15:42:04 crc kubenswrapper[4743]: I0310 15:42:04.555938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552622-26d76" 
event={"ID":"0ca96e86-ba03-4721-a048-44952f8dd42c","Type":"ContainerDied","Data":"3438dd95ee26942bb61121df3ae0b6bed2c5b5648de8e37eeb9e8ea494a09e6f"} Mar 10 15:42:04 crc kubenswrapper[4743]: I0310 15:42:04.961021 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552622-26d76" Mar 10 15:42:05 crc kubenswrapper[4743]: I0310 15:42:05.160557 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6wz5\" (UniqueName: \"kubernetes.io/projected/0ca96e86-ba03-4721-a048-44952f8dd42c-kube-api-access-x6wz5\") pod \"0ca96e86-ba03-4721-a048-44952f8dd42c\" (UID: \"0ca96e86-ba03-4721-a048-44952f8dd42c\") " Mar 10 15:42:05 crc kubenswrapper[4743]: I0310 15:42:05.172038 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca96e86-ba03-4721-a048-44952f8dd42c-kube-api-access-x6wz5" (OuterVolumeSpecName: "kube-api-access-x6wz5") pod "0ca96e86-ba03-4721-a048-44952f8dd42c" (UID: "0ca96e86-ba03-4721-a048-44952f8dd42c"). InnerVolumeSpecName "kube-api-access-x6wz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:42:05 crc kubenswrapper[4743]: I0310 15:42:05.263046 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6wz5\" (UniqueName: \"kubernetes.io/projected/0ca96e86-ba03-4721-a048-44952f8dd42c-kube-api-access-x6wz5\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:05 crc kubenswrapper[4743]: I0310 15:42:05.572311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552622-26d76" event={"ID":"0ca96e86-ba03-4721-a048-44952f8dd42c","Type":"ContainerDied","Data":"6b7c04c99df52aa33badbc2e9389a3fac89f35590f44141ea66b8d99f30b8f9b"} Mar 10 15:42:05 crc kubenswrapper[4743]: I0310 15:42:05.572400 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b7c04c99df52aa33badbc2e9389a3fac89f35590f44141ea66b8d99f30b8f9b" Mar 10 15:42:05 crc kubenswrapper[4743]: I0310 15:42:05.572410 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552622-26d76" Mar 10 15:42:06 crc kubenswrapper[4743]: I0310 15:42:06.055970 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552616-v5mqc"] Mar 10 15:42:06 crc kubenswrapper[4743]: I0310 15:42:06.065967 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552616-v5mqc"] Mar 10 15:42:07 crc kubenswrapper[4743]: I0310 15:42:07.933629 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1252848b-7af8-4167-aa0d-10efdc284e40" path="/var/lib/kubelet/pods/1252848b-7af8-4167-aa0d-10efdc284e40/volumes" Mar 10 15:42:17 crc kubenswrapper[4743]: I0310 15:42:17.689284 4743 generic.go:334] "Generic (PLEG): container finished" podID="b481f000-abd5-4ee6-8b39-301e66c22f2a" containerID="3bd2374c8866c1ca7f1c011551ddb3b2d998d70a6f19f7c6960d9c0a08be5cb1" exitCode=0 Mar 10 15:42:17 crc kubenswrapper[4743]: I0310 15:42:17.689359 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" event={"ID":"b481f000-abd5-4ee6-8b39-301e66c22f2a","Type":"ContainerDied","Data":"3bd2374c8866c1ca7f1c011551ddb3b2d998d70a6f19f7c6960d9c0a08be5cb1"} Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.142823 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wl8pc"] Mar 10 15:42:19 crc kubenswrapper[4743]: E0310 15:42:19.143495 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca96e86-ba03-4721-a048-44952f8dd42c" containerName="oc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.143507 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca96e86-ba03-4721-a048-44952f8dd42c" containerName="oc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.143694 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca96e86-ba03-4721-a048-44952f8dd42c" containerName="oc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.145017 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.153149 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wl8pc"] Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.162342 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.282324 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh9lv\" (UniqueName: \"kubernetes.io/projected/b481f000-abd5-4ee6-8b39-301e66c22f2a-kube-api-access-bh9lv\") pod \"b481f000-abd5-4ee6-8b39-301e66c22f2a\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.282789 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-inventory\") pod \"b481f000-abd5-4ee6-8b39-301e66c22f2a\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.282930 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ssh-key-openstack-edpm-ipam\") pod \"b481f000-abd5-4ee6-8b39-301e66c22f2a\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.282967 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0\") pod \"b481f000-abd5-4ee6-8b39-301e66c22f2a\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.282991 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovn-combined-ca-bundle\") pod \"b481f000-abd5-4ee6-8b39-301e66c22f2a\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.283332 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-catalog-content\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.283444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-utilities\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.283487 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hnn5\" (UniqueName: \"kubernetes.io/projected/756d836d-5b53-450f-9e2c-d19324424feb-kube-api-access-9hnn5\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.288854 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b481f000-abd5-4ee6-8b39-301e66c22f2a-kube-api-access-bh9lv" (OuterVolumeSpecName: "kube-api-access-bh9lv") pod "b481f000-abd5-4ee6-8b39-301e66c22f2a" (UID: "b481f000-abd5-4ee6-8b39-301e66c22f2a"). InnerVolumeSpecName "kube-api-access-bh9lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.291403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b481f000-abd5-4ee6-8b39-301e66c22f2a" (UID: "b481f000-abd5-4ee6-8b39-301e66c22f2a"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:42:19 crc kubenswrapper[4743]: E0310 15:42:19.318745 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0 podName:b481f000-abd5-4ee6-8b39-301e66c22f2a nodeName:}" failed. No retries permitted until 2026-03-10 15:42:19.818689983 +0000 UTC m=+2204.525504731 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovncontroller-config-0" (UniqueName: "kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0") pod "b481f000-abd5-4ee6-8b39-301e66c22f2a" (UID: "b481f000-abd5-4ee6-8b39-301e66c22f2a") : error deleting /var/lib/kubelet/pods/b481f000-abd5-4ee6-8b39-301e66c22f2a/volume-subpaths: remove /var/lib/kubelet/pods/b481f000-abd5-4ee6-8b39-301e66c22f2a/volume-subpaths: no such file or directory Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.323001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b481f000-abd5-4ee6-8b39-301e66c22f2a" (UID: "b481f000-abd5-4ee6-8b39-301e66c22f2a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.324367 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-inventory" (OuterVolumeSpecName: "inventory") pod "b481f000-abd5-4ee6-8b39-301e66c22f2a" (UID: "b481f000-abd5-4ee6-8b39-301e66c22f2a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.386220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hnn5\" (UniqueName: \"kubernetes.io/projected/756d836d-5b53-450f-9e2c-d19324424feb-kube-api-access-9hnn5\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.386366 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-catalog-content\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.386499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-utilities\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.386594 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh9lv\" (UniqueName: \"kubernetes.io/projected/b481f000-abd5-4ee6-8b39-301e66c22f2a-kube-api-access-bh9lv\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.386613 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.386631 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.386645 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.387227 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-utilities\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.387338 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-catalog-content\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.408913 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hnn5\" (UniqueName: \"kubernetes.io/projected/756d836d-5b53-450f-9e2c-d19324424feb-kube-api-access-9hnn5\") pod \"redhat-operators-wl8pc\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.477510 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.707957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" event={"ID":"b481f000-abd5-4ee6-8b39-301e66c22f2a","Type":"ContainerDied","Data":"f5c382d7800f6271bb264796dc2d43916c077c92262a38c060b661836dbb55fc"} Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.708027 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5c382d7800f6271bb264796dc2d43916c077c92262a38c060b661836dbb55fc" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.708144 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pmgm5" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.825006 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6"] Mar 10 15:42:19 crc kubenswrapper[4743]: E0310 15:42:19.825664 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b481f000-abd5-4ee6-8b39-301e66c22f2a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.825683 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b481f000-abd5-4ee6-8b39-301e66c22f2a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.825931 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b481f000-abd5-4ee6-8b39-301e66c22f2a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.826582 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.828583 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.834835 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.836563 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6"] Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.894353 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0\") pod \"b481f000-abd5-4ee6-8b39-301e66c22f2a\" (UID: \"b481f000-abd5-4ee6-8b39-301e66c22f2a\") " Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.894970 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.895002 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.895019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b481f000-abd5-4ee6-8b39-301e66c22f2a" (UID: "b481f000-abd5-4ee6-8b39-301e66c22f2a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.895177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.895500 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.895623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:19 
crc kubenswrapper[4743]: I0310 15:42:19.896002 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2rt\" (UniqueName: \"kubernetes.io/projected/094b793a-799c-4f28-adb3-7caa9a6e732a-kube-api-access-pr2rt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.896142 4743 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b481f000-abd5-4ee6-8b39-301e66c22f2a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:19 crc kubenswrapper[4743]: I0310 15:42:19.992120 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wl8pc"] Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.003115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr2rt\" (UniqueName: \"kubernetes.io/projected/094b793a-799c-4f28-adb3-7caa9a6e732a-kube-api-access-pr2rt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.003212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.003237 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.003298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.003419 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.003444 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.007905 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.008371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.013321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.014231 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.014543 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.021536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr2rt\" (UniqueName: \"kubernetes.io/projected/094b793a-799c-4f28-adb3-7caa9a6e732a-kube-api-access-pr2rt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.146653 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.726548 4743 generic.go:334] "Generic (PLEG): container finished" podID="756d836d-5b53-450f-9e2c-d19324424feb" containerID="bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f" exitCode=0 Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.726601 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wl8pc" event={"ID":"756d836d-5b53-450f-9e2c-d19324424feb","Type":"ContainerDied","Data":"bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f"} Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.726628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wl8pc" event={"ID":"756d836d-5b53-450f-9e2c-d19324424feb","Type":"ContainerStarted","Data":"ad4294c576beba30b6150c20f00888aee252298c738e30dab976929ce6505e29"} Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.732882 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:42:20 crc kubenswrapper[4743]: I0310 15:42:20.772861 4743 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6"] Mar 10 15:42:20 crc kubenswrapper[4743]: W0310 15:42:20.787740 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod094b793a_799c_4f28_adb3_7caa9a6e732a.slice/crio-542f015486bf0e371a88d4d533daa9119781094804e8802775acf099d8bd5843 WatchSource:0}: Error finding container 542f015486bf0e371a88d4d533daa9119781094804e8802775acf099d8bd5843: Status 404 returned error can't find the container with id 542f015486bf0e371a88d4d533daa9119781094804e8802775acf099d8bd5843 Mar 10 15:42:21 crc kubenswrapper[4743]: I0310 15:42:21.738090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" event={"ID":"094b793a-799c-4f28-adb3-7caa9a6e732a","Type":"ContainerStarted","Data":"b41fda38a04de3cbdf12dab4758e5a1df0aee08f51ba648327453670df27ad7b"} Mar 10 15:42:21 crc kubenswrapper[4743]: I0310 15:42:21.738413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" event={"ID":"094b793a-799c-4f28-adb3-7caa9a6e732a","Type":"ContainerStarted","Data":"542f015486bf0e371a88d4d533daa9119781094804e8802775acf099d8bd5843"} Mar 10 15:42:21 crc kubenswrapper[4743]: I0310 15:42:21.762406 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" podStartSLOduration=2.174863277 podStartE2EDuration="2.762383726s" podCreationTimestamp="2026-03-10 15:42:19 +0000 UTC" firstStartedPulling="2026-03-10 15:42:20.790232265 +0000 UTC m=+2205.497047013" lastFinishedPulling="2026-03-10 15:42:21.377752704 +0000 UTC m=+2206.084567462" observedRunningTime="2026-03-10 15:42:21.753792362 +0000 UTC m=+2206.460607120" watchObservedRunningTime="2026-03-10 15:42:21.762383726 +0000 UTC 
m=+2206.469198484" Mar 10 15:42:22 crc kubenswrapper[4743]: I0310 15:42:22.757109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wl8pc" event={"ID":"756d836d-5b53-450f-9e2c-d19324424feb","Type":"ContainerStarted","Data":"1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b"} Mar 10 15:42:27 crc kubenswrapper[4743]: I0310 15:42:27.806940 4743 generic.go:334] "Generic (PLEG): container finished" podID="756d836d-5b53-450f-9e2c-d19324424feb" containerID="1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b" exitCode=0 Mar 10 15:42:27 crc kubenswrapper[4743]: I0310 15:42:27.807002 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wl8pc" event={"ID":"756d836d-5b53-450f-9e2c-d19324424feb","Type":"ContainerDied","Data":"1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b"} Mar 10 15:42:28 crc kubenswrapper[4743]: I0310 15:42:28.825701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wl8pc" event={"ID":"756d836d-5b53-450f-9e2c-d19324424feb","Type":"ContainerStarted","Data":"caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373"} Mar 10 15:42:28 crc kubenswrapper[4743]: I0310 15:42:28.859620 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wl8pc" podStartSLOduration=2.394330897 podStartE2EDuration="9.859599175s" podCreationTimestamp="2026-03-10 15:42:19 +0000 UTC" firstStartedPulling="2026-03-10 15:42:20.732630995 +0000 UTC m=+2205.439445743" lastFinishedPulling="2026-03-10 15:42:28.197899263 +0000 UTC m=+2212.904714021" observedRunningTime="2026-03-10 15:42:28.856129066 +0000 UTC m=+2213.562943814" watchObservedRunningTime="2026-03-10 15:42:28.859599175 +0000 UTC m=+2213.566413913" Mar 10 15:42:29 crc kubenswrapper[4743]: I0310 15:42:29.478111 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:29 crc kubenswrapper[4743]: I0310 15:42:29.478160 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:30 crc kubenswrapper[4743]: I0310 15:42:30.539092 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wl8pc" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="registry-server" probeResult="failure" output=< Mar 10 15:42:30 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 15:42:30 crc kubenswrapper[4743]: > Mar 10 15:42:40 crc kubenswrapper[4743]: I0310 15:42:40.523116 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wl8pc" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="registry-server" probeResult="failure" output=< Mar 10 15:42:40 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 15:42:40 crc kubenswrapper[4743]: > Mar 10 15:42:49 crc kubenswrapper[4743]: I0310 15:42:49.542488 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:49 crc kubenswrapper[4743]: I0310 15:42:49.597113 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:50 crc kubenswrapper[4743]: I0310 15:42:50.341296 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wl8pc"] Mar 10 15:42:51 crc kubenswrapper[4743]: I0310 15:42:51.448765 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wl8pc" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="registry-server" containerID="cri-o://caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373" gracePeriod=2 Mar 10 
15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.026829 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.226926 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hnn5\" (UniqueName: \"kubernetes.io/projected/756d836d-5b53-450f-9e2c-d19324424feb-kube-api-access-9hnn5\") pod \"756d836d-5b53-450f-9e2c-d19324424feb\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.226977 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-utilities\") pod \"756d836d-5b53-450f-9e2c-d19324424feb\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.227281 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-catalog-content\") pod \"756d836d-5b53-450f-9e2c-d19324424feb\" (UID: \"756d836d-5b53-450f-9e2c-d19324424feb\") " Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.230190 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-utilities" (OuterVolumeSpecName: "utilities") pod "756d836d-5b53-450f-9e2c-d19324424feb" (UID: "756d836d-5b53-450f-9e2c-d19324424feb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.233378 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756d836d-5b53-450f-9e2c-d19324424feb-kube-api-access-9hnn5" (OuterVolumeSpecName: "kube-api-access-9hnn5") pod "756d836d-5b53-450f-9e2c-d19324424feb" (UID: "756d836d-5b53-450f-9e2c-d19324424feb"). InnerVolumeSpecName "kube-api-access-9hnn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.329472 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hnn5\" (UniqueName: \"kubernetes.io/projected/756d836d-5b53-450f-9e2c-d19324424feb-kube-api-access-9hnn5\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.329507 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.393153 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "756d836d-5b53-450f-9e2c-d19324424feb" (UID: "756d836d-5b53-450f-9e2c-d19324424feb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.431149 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756d836d-5b53-450f-9e2c-d19324424feb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.459949 4743 generic.go:334] "Generic (PLEG): container finished" podID="756d836d-5b53-450f-9e2c-d19324424feb" containerID="caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373" exitCode=0 Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.460001 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wl8pc" event={"ID":"756d836d-5b53-450f-9e2c-d19324424feb","Type":"ContainerDied","Data":"caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373"} Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.460017 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wl8pc" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.460055 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wl8pc" event={"ID":"756d836d-5b53-450f-9e2c-d19324424feb","Type":"ContainerDied","Data":"ad4294c576beba30b6150c20f00888aee252298c738e30dab976929ce6505e29"} Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.460079 4743 scope.go:117] "RemoveContainer" containerID="caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.499227 4743 scope.go:117] "RemoveContainer" containerID="1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.504903 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wl8pc"] Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.514941 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wl8pc"] Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.528281 4743 scope.go:117] "RemoveContainer" containerID="bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.591150 4743 scope.go:117] "RemoveContainer" containerID="caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373" Mar 10 15:42:52 crc kubenswrapper[4743]: E0310 15:42:52.595232 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373\": container with ID starting with caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373 not found: ID does not exist" containerID="caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.595287 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373"} err="failed to get container status \"caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373\": rpc error: code = NotFound desc = could not find container \"caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373\": container with ID starting with caddf99b4fd87fffb2d2798d51905a0d7ea669adf6696f5e9b3c33501b168373 not found: ID does not exist" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.595316 4743 scope.go:117] "RemoveContainer" containerID="1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b" Mar 10 15:42:52 crc kubenswrapper[4743]: E0310 15:42:52.597495 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b\": container with ID starting with 1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b not found: ID does not exist" containerID="1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.597533 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b"} err="failed to get container status \"1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b\": rpc error: code = NotFound desc = could not find container \"1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b\": container with ID starting with 1cb9518946947ddbefda9e2581b9dd3f561d50667427f9caadaeb4a79e11904b not found: ID does not exist" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.597554 4743 scope.go:117] "RemoveContainer" containerID="bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f" Mar 10 15:42:52 crc kubenswrapper[4743]: E0310 
15:42:52.597962 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f\": container with ID starting with bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f not found: ID does not exist" containerID="bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f" Mar 10 15:42:52 crc kubenswrapper[4743]: I0310 15:42:52.597996 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f"} err="failed to get container status \"bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f\": rpc error: code = NotFound desc = could not find container \"bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f\": container with ID starting with bf7c945561ccb71e26d78e96c25449655273df6e9130a96e00229497d658233f not found: ID does not exist" Mar 10 15:42:53 crc kubenswrapper[4743]: I0310 15:42:53.967010 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756d836d-5b53-450f-9e2c-d19324424feb" path="/var/lib/kubelet/pods/756d836d-5b53-450f-9e2c-d19324424feb/volumes" Mar 10 15:42:55 crc kubenswrapper[4743]: I0310 15:42:55.700032 4743 scope.go:117] "RemoveContainer" containerID="b91085497b22c7f17b848c338c16f1870cf5d1dbc17b27c8adc3e9781a2d57b3" Mar 10 15:43:09 crc kubenswrapper[4743]: I0310 15:43:09.637694 4743 generic.go:334] "Generic (PLEG): container finished" podID="094b793a-799c-4f28-adb3-7caa9a6e732a" containerID="b41fda38a04de3cbdf12dab4758e5a1df0aee08f51ba648327453670df27ad7b" exitCode=0 Mar 10 15:43:09 crc kubenswrapper[4743]: I0310 15:43:09.638013 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" 
event={"ID":"094b793a-799c-4f28-adb3-7caa9a6e732a","Type":"ContainerDied","Data":"b41fda38a04de3cbdf12dab4758e5a1df0aee08f51ba648327453670df27ad7b"} Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.114131 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.206259 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr2rt\" (UniqueName: \"kubernetes.io/projected/094b793a-799c-4f28-adb3-7caa9a6e732a-kube-api-access-pr2rt\") pod \"094b793a-799c-4f28-adb3-7caa9a6e732a\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.207017 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"094b793a-799c-4f28-adb3-7caa9a6e732a\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.207233 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-nova-metadata-neutron-config-0\") pod \"094b793a-799c-4f28-adb3-7caa9a6e732a\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.207374 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-inventory\") pod \"094b793a-799c-4f28-adb3-7caa9a6e732a\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.207548 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-ssh-key-openstack-edpm-ipam\") pod \"094b793a-799c-4f28-adb3-7caa9a6e732a\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.207620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-metadata-combined-ca-bundle\") pod \"094b793a-799c-4f28-adb3-7caa9a6e732a\" (UID: \"094b793a-799c-4f28-adb3-7caa9a6e732a\") " Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.214132 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/094b793a-799c-4f28-adb3-7caa9a6e732a-kube-api-access-pr2rt" (OuterVolumeSpecName: "kube-api-access-pr2rt") pod "094b793a-799c-4f28-adb3-7caa9a6e732a" (UID: "094b793a-799c-4f28-adb3-7caa9a6e732a"). InnerVolumeSpecName "kube-api-access-pr2rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.230936 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "094b793a-799c-4f28-adb3-7caa9a6e732a" (UID: "094b793a-799c-4f28-adb3-7caa9a6e732a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.241029 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "094b793a-799c-4f28-adb3-7caa9a6e732a" (UID: "094b793a-799c-4f28-adb3-7caa9a6e732a"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.248549 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "094b793a-799c-4f28-adb3-7caa9a6e732a" (UID: "094b793a-799c-4f28-adb3-7caa9a6e732a"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.252550 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.252601 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.253977 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "094b793a-799c-4f28-adb3-7caa9a6e732a" (UID: "094b793a-799c-4f28-adb3-7caa9a6e732a"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.257479 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-inventory" (OuterVolumeSpecName: "inventory") pod "094b793a-799c-4f28-adb3-7caa9a6e732a" (UID: "094b793a-799c-4f28-adb3-7caa9a6e732a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.311365 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.311405 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.311417 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.311426 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.311435 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094b793a-799c-4f28-adb3-7caa9a6e732a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 
15:43:11.311445 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr2rt\" (UniqueName: \"kubernetes.io/projected/094b793a-799c-4f28-adb3-7caa9a6e732a-kube-api-access-pr2rt\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.667068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" event={"ID":"094b793a-799c-4f28-adb3-7caa9a6e732a","Type":"ContainerDied","Data":"542f015486bf0e371a88d4d533daa9119781094804e8802775acf099d8bd5843"} Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.667146 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="542f015486bf0e371a88d4d533daa9119781094804e8802775acf099d8bd5843" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.667170 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.809852 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m"] Mar 10 15:43:11 crc kubenswrapper[4743]: E0310 15:43:11.810388 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="registry-server" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.810409 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="registry-server" Mar 10 15:43:11 crc kubenswrapper[4743]: E0310 15:43:11.810428 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="extract-content" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.810436 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="extract-content" Mar 10 
15:43:11 crc kubenswrapper[4743]: E0310 15:43:11.810459 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="extract-utilities" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.810466 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="extract-utilities" Mar 10 15:43:11 crc kubenswrapper[4743]: E0310 15:43:11.810473 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094b793a-799c-4f28-adb3-7caa9a6e732a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.810480 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="094b793a-799c-4f28-adb3-7caa9a6e732a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.810665 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="094b793a-799c-4f28-adb3-7caa9a6e732a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.810688 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="756d836d-5b53-450f-9e2c-d19324424feb" containerName="registry-server" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.811377 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.813365 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.815540 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.815889 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.816551 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.817293 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.843544 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m"] Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.932608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.932660 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.932682 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.932852 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbqp\" (UniqueName: \"kubernetes.io/projected/b982c5ef-116d-4e18-a707-768c7f0fbfc0-kube-api-access-lwbqp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:11 crc kubenswrapper[4743]: I0310 15:43:11.932907 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.035471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.036762 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.036806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.036896 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbqp\" (UniqueName: \"kubernetes.io/projected/b982c5ef-116d-4e18-a707-768c7f0fbfc0-kube-api-access-lwbqp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.036935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.042100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: 
\"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.042231 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.042946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.044163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.060046 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbqp\" (UniqueName: \"kubernetes.io/projected/b982c5ef-116d-4e18-a707-768c7f0fbfc0-kube-api-access-lwbqp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fd24m\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.144976 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:43:12 crc kubenswrapper[4743]: I0310 15:43:12.730202 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m"] Mar 10 15:43:13 crc kubenswrapper[4743]: I0310 15:43:13.689787 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" event={"ID":"b982c5ef-116d-4e18-a707-768c7f0fbfc0","Type":"ContainerStarted","Data":"158e1783698d99efdeaa8127e0cdc981fcfdc8da7f3830a96ab52c64f5792d7b"} Mar 10 15:43:13 crc kubenswrapper[4743]: I0310 15:43:13.690142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" event={"ID":"b982c5ef-116d-4e18-a707-768c7f0fbfc0","Type":"ContainerStarted","Data":"aed052d005af8b0636b499ed9cb45f532360d2b55d23f7be4fd365f99e96efa4"} Mar 10 15:43:13 crc kubenswrapper[4743]: I0310 15:43:13.745477 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" podStartSLOduration=2.179205474 podStartE2EDuration="2.745444088s" podCreationTimestamp="2026-03-10 15:43:11 +0000 UTC" firstStartedPulling="2026-03-10 15:43:12.740456471 +0000 UTC m=+2257.447271229" lastFinishedPulling="2026-03-10 15:43:13.306695055 +0000 UTC m=+2258.013509843" observedRunningTime="2026-03-10 15:43:13.712328105 +0000 UTC m=+2258.419142883" watchObservedRunningTime="2026-03-10 15:43:13.745444088 +0000 UTC m=+2258.452258876" Mar 10 15:43:22 crc kubenswrapper[4743]: I0310 15:43:22.945923 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9w6j9"] Mar 10 15:43:22 crc kubenswrapper[4743]: I0310 15:43:22.951975 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:22 crc kubenswrapper[4743]: I0310 15:43:22.985257 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w6j9"] Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.081444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shrwd\" (UniqueName: \"kubernetes.io/projected/88789c46-c619-4fb1-a76a-1763c8f34eb2-kube-api-access-shrwd\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.081533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-catalog-content\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.081576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-utilities\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.183752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shrwd\" (UniqueName: \"kubernetes.io/projected/88789c46-c619-4fb1-a76a-1763c8f34eb2-kube-api-access-shrwd\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.183860 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-catalog-content\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.183892 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-utilities\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.184343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-catalog-content\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.184394 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-utilities\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.217696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shrwd\" (UniqueName: \"kubernetes.io/projected/88789c46-c619-4fb1-a76a-1763c8f34eb2-kube-api-access-shrwd\") pod \"community-operators-9w6j9\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.304405 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:23 crc kubenswrapper[4743]: I0310 15:43:23.850890 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w6j9"] Mar 10 15:43:24 crc kubenswrapper[4743]: I0310 15:43:24.809367 4743 generic.go:334] "Generic (PLEG): container finished" podID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerID="5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8" exitCode=0 Mar 10 15:43:24 crc kubenswrapper[4743]: I0310 15:43:24.810943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w6j9" event={"ID":"88789c46-c619-4fb1-a76a-1763c8f34eb2","Type":"ContainerDied","Data":"5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8"} Mar 10 15:43:24 crc kubenswrapper[4743]: I0310 15:43:24.811052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w6j9" event={"ID":"88789c46-c619-4fb1-a76a-1763c8f34eb2","Type":"ContainerStarted","Data":"ac98ab9e377b3076466b644556a6616af38deb2d288ba85ab9c32bf1126987a5"} Mar 10 15:43:25 crc kubenswrapper[4743]: I0310 15:43:25.824431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w6j9" event={"ID":"88789c46-c619-4fb1-a76a-1763c8f34eb2","Type":"ContainerStarted","Data":"98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1"} Mar 10 15:43:27 crc kubenswrapper[4743]: I0310 15:43:27.845223 4743 generic.go:334] "Generic (PLEG): container finished" podID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerID="98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1" exitCode=0 Mar 10 15:43:27 crc kubenswrapper[4743]: I0310 15:43:27.845303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w6j9" 
event={"ID":"88789c46-c619-4fb1-a76a-1763c8f34eb2","Type":"ContainerDied","Data":"98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1"} Mar 10 15:43:28 crc kubenswrapper[4743]: I0310 15:43:28.860331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w6j9" event={"ID":"88789c46-c619-4fb1-a76a-1763c8f34eb2","Type":"ContainerStarted","Data":"297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154"} Mar 10 15:43:28 crc kubenswrapper[4743]: I0310 15:43:28.891837 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9w6j9" podStartSLOduration=3.33489257 podStartE2EDuration="6.89179638s" podCreationTimestamp="2026-03-10 15:43:22 +0000 UTC" firstStartedPulling="2026-03-10 15:43:24.815168271 +0000 UTC m=+2269.521983039" lastFinishedPulling="2026-03-10 15:43:28.372072091 +0000 UTC m=+2273.078886849" observedRunningTime="2026-03-10 15:43:28.881554358 +0000 UTC m=+2273.588369126" watchObservedRunningTime="2026-03-10 15:43:28.89179638 +0000 UTC m=+2273.598611118" Mar 10 15:43:33 crc kubenswrapper[4743]: I0310 15:43:33.305092 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:33 crc kubenswrapper[4743]: I0310 15:43:33.305344 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:33 crc kubenswrapper[4743]: I0310 15:43:33.366377 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:33 crc kubenswrapper[4743]: I0310 15:43:33.973344 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:34 crc kubenswrapper[4743]: I0310 15:43:34.030956 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-9w6j9"] Mar 10 15:43:35 crc kubenswrapper[4743]: I0310 15:43:35.931693 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9w6j9" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerName="registry-server" containerID="cri-o://297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154" gracePeriod=2 Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.465666 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.611161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-catalog-content\") pod \"88789c46-c619-4fb1-a76a-1763c8f34eb2\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.611247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-utilities\") pod \"88789c46-c619-4fb1-a76a-1763c8f34eb2\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.612521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-utilities" (OuterVolumeSpecName: "utilities") pod "88789c46-c619-4fb1-a76a-1763c8f34eb2" (UID: "88789c46-c619-4fb1-a76a-1763c8f34eb2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.612616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shrwd\" (UniqueName: \"kubernetes.io/projected/88789c46-c619-4fb1-a76a-1763c8f34eb2-kube-api-access-shrwd\") pod \"88789c46-c619-4fb1-a76a-1763c8f34eb2\" (UID: \"88789c46-c619-4fb1-a76a-1763c8f34eb2\") " Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.613315 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.622288 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88789c46-c619-4fb1-a76a-1763c8f34eb2-kube-api-access-shrwd" (OuterVolumeSpecName: "kube-api-access-shrwd") pod "88789c46-c619-4fb1-a76a-1763c8f34eb2" (UID: "88789c46-c619-4fb1-a76a-1763c8f34eb2"). InnerVolumeSpecName "kube-api-access-shrwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.666293 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88789c46-c619-4fb1-a76a-1763c8f34eb2" (UID: "88789c46-c619-4fb1-a76a-1763c8f34eb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.716397 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88789c46-c619-4fb1-a76a-1763c8f34eb2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.716476 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shrwd\" (UniqueName: \"kubernetes.io/projected/88789c46-c619-4fb1-a76a-1763c8f34eb2-kube-api-access-shrwd\") on node \"crc\" DevicePath \"\"" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.939495 4743 generic.go:334] "Generic (PLEG): container finished" podID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerID="297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154" exitCode=0 Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.939545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w6j9" event={"ID":"88789c46-c619-4fb1-a76a-1763c8f34eb2","Type":"ContainerDied","Data":"297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154"} Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.939606 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9w6j9" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.940802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w6j9" event={"ID":"88789c46-c619-4fb1-a76a-1763c8f34eb2","Type":"ContainerDied","Data":"ac98ab9e377b3076466b644556a6616af38deb2d288ba85ab9c32bf1126987a5"} Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.940844 4743 scope.go:117] "RemoveContainer" containerID="297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154" Mar 10 15:43:36 crc kubenswrapper[4743]: I0310 15:43:36.989299 4743 scope.go:117] "RemoveContainer" containerID="98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1" Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.006868 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w6j9"] Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.021241 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9w6j9"] Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.048301 4743 scope.go:117] "RemoveContainer" containerID="5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8" Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.086298 4743 scope.go:117] "RemoveContainer" containerID="297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154" Mar 10 15:43:37 crc kubenswrapper[4743]: E0310 15:43:37.087202 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154\": container with ID starting with 297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154 not found: ID does not exist" containerID="297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154" Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.087241 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154"} err="failed to get container status \"297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154\": rpc error: code = NotFound desc = could not find container \"297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154\": container with ID starting with 297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154 not found: ID does not exist" Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.087268 4743 scope.go:117] "RemoveContainer" containerID="98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1" Mar 10 15:43:37 crc kubenswrapper[4743]: E0310 15:43:37.087694 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1\": container with ID starting with 98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1 not found: ID does not exist" containerID="98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1" Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.087770 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1"} err="failed to get container status \"98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1\": rpc error: code = NotFound desc = could not find container \"98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1\": container with ID starting with 98fa83502cf0911b50e3668330d7560ef03ac970c51be7f682af86116200d9e1 not found: ID does not exist" Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.087880 4743 scope.go:117] "RemoveContainer" containerID="5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8" Mar 10 15:43:37 crc kubenswrapper[4743]: E0310 
15:43:37.088297 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8\": container with ID starting with 5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8 not found: ID does not exist" containerID="5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8" Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.088328 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8"} err="failed to get container status \"5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8\": rpc error: code = NotFound desc = could not find container \"5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8\": container with ID starting with 5e704f5b5db9f534f9069babeb56d143cff806ec1f1af52aca4a4def4b2f60c8 not found: ID does not exist" Mar 10 15:43:37 crc kubenswrapper[4743]: I0310 15:43:37.928969 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" path="/var/lib/kubelet/pods/88789c46-c619-4fb1-a76a-1763c8f34eb2/volumes" Mar 10 15:43:39 crc kubenswrapper[4743]: E0310 15:43:39.875024 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88789c46_c619_4fb1_a76a_1763c8f34eb2.slice/crio-297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:43:41 crc kubenswrapper[4743]: I0310 15:43:41.252570 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 15:43:41 crc kubenswrapper[4743]: I0310 15:43:41.252950 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:43:50 crc kubenswrapper[4743]: E0310 15:43:50.109434 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88789c46_c619_4fb1_a76a_1763c8f34eb2.slice/crio-297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.161469 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552624-6cthv"] Mar 10 15:44:00 crc kubenswrapper[4743]: E0310 15:44:00.162650 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerName="registry-server" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.162666 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerName="registry-server" Mar 10 15:44:00 crc kubenswrapper[4743]: E0310 15:44:00.162691 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerName="extract-utilities" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.162699 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerName="extract-utilities" Mar 10 15:44:00 crc kubenswrapper[4743]: E0310 15:44:00.162726 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" 
containerName="extract-content" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.162732 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerName="extract-content" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.162961 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="88789c46-c619-4fb1-a76a-1763c8f34eb2" containerName="registry-server" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.163613 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552624-6cthv" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.166776 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.167116 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.167343 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.187758 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552624-6cthv"] Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.250994 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxp75\" (UniqueName: \"kubernetes.io/projected/d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f-kube-api-access-fxp75\") pod \"auto-csr-approver-29552624-6cthv\" (UID: \"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f\") " pod="openshift-infra/auto-csr-approver-29552624-6cthv" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.352687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxp75\" (UniqueName: 
\"kubernetes.io/projected/d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f-kube-api-access-fxp75\") pod \"auto-csr-approver-29552624-6cthv\" (UID: \"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f\") " pod="openshift-infra/auto-csr-approver-29552624-6cthv" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.375697 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxp75\" (UniqueName: \"kubernetes.io/projected/d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f-kube-api-access-fxp75\") pod \"auto-csr-approver-29552624-6cthv\" (UID: \"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f\") " pod="openshift-infra/auto-csr-approver-29552624-6cthv" Mar 10 15:44:00 crc kubenswrapper[4743]: E0310 15:44:00.399028 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88789c46_c619_4fb1_a76a_1763c8f34eb2.slice/crio-297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.491625 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552624-6cthv" Mar 10 15:44:00 crc kubenswrapper[4743]: I0310 15:44:00.757210 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552624-6cthv"] Mar 10 15:44:01 crc kubenswrapper[4743]: I0310 15:44:01.187265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552624-6cthv" event={"ID":"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f","Type":"ContainerStarted","Data":"6871dc52de6805698948ac7445bc6e0dc20748207cbc3a40cc0dc68588d01a39"} Mar 10 15:44:02 crc kubenswrapper[4743]: I0310 15:44:02.200122 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552624-6cthv" event={"ID":"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f","Type":"ContainerStarted","Data":"b701b41e53de5bf1b014e40c38f31c42770b896c100b6d7564f7d61e9c1cae1b"} Mar 10 15:44:02 crc kubenswrapper[4743]: I0310 15:44:02.218546 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552624-6cthv" podStartSLOduration=1.104950857 podStartE2EDuration="2.218527755s" podCreationTimestamp="2026-03-10 15:44:00 +0000 UTC" firstStartedPulling="2026-03-10 15:44:00.766385546 +0000 UTC m=+2305.473200304" lastFinishedPulling="2026-03-10 15:44:01.879962454 +0000 UTC m=+2306.586777202" observedRunningTime="2026-03-10 15:44:02.214520441 +0000 UTC m=+2306.921335189" watchObservedRunningTime="2026-03-10 15:44:02.218527755 +0000 UTC m=+2306.925342503" Mar 10 15:44:03 crc kubenswrapper[4743]: I0310 15:44:03.211410 4743 generic.go:334] "Generic (PLEG): container finished" podID="d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f" containerID="b701b41e53de5bf1b014e40c38f31c42770b896c100b6d7564f7d61e9c1cae1b" exitCode=0 Mar 10 15:44:03 crc kubenswrapper[4743]: I0310 15:44:03.211459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552624-6cthv" 
event={"ID":"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f","Type":"ContainerDied","Data":"b701b41e53de5bf1b014e40c38f31c42770b896c100b6d7564f7d61e9c1cae1b"} Mar 10 15:44:04 crc kubenswrapper[4743]: I0310 15:44:04.553181 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552624-6cthv" Mar 10 15:44:04 crc kubenswrapper[4743]: I0310 15:44:04.664619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxp75\" (UniqueName: \"kubernetes.io/projected/d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f-kube-api-access-fxp75\") pod \"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f\" (UID: \"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f\") " Mar 10 15:44:04 crc kubenswrapper[4743]: I0310 15:44:04.673454 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f-kube-api-access-fxp75" (OuterVolumeSpecName: "kube-api-access-fxp75") pod "d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f" (UID: "d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f"). InnerVolumeSpecName "kube-api-access-fxp75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:44:04 crc kubenswrapper[4743]: I0310 15:44:04.767416 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxp75\" (UniqueName: \"kubernetes.io/projected/d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f-kube-api-access-fxp75\") on node \"crc\" DevicePath \"\"" Mar 10 15:44:05 crc kubenswrapper[4743]: I0310 15:44:05.228947 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552624-6cthv" event={"ID":"d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f","Type":"ContainerDied","Data":"6871dc52de6805698948ac7445bc6e0dc20748207cbc3a40cc0dc68588d01a39"} Mar 10 15:44:05 crc kubenswrapper[4743]: I0310 15:44:05.228989 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6871dc52de6805698948ac7445bc6e0dc20748207cbc3a40cc0dc68588d01a39" Mar 10 15:44:05 crc kubenswrapper[4743]: I0310 15:44:05.229026 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552624-6cthv" Mar 10 15:44:05 crc kubenswrapper[4743]: I0310 15:44:05.293012 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552618-hrmvq"] Mar 10 15:44:05 crc kubenswrapper[4743]: I0310 15:44:05.301030 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552618-hrmvq"] Mar 10 15:44:05 crc kubenswrapper[4743]: I0310 15:44:05.938469 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85dfcae8-b723-4e7c-9686-25f8c2b71a4c" path="/var/lib/kubelet/pods/85dfcae8-b723-4e7c-9686-25f8c2b71a4c/volumes" Mar 10 15:44:10 crc kubenswrapper[4743]: E0310 15:44:10.687977 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88789c46_c619_4fb1_a76a_1763c8f34eb2.slice/crio-297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:44:11 crc kubenswrapper[4743]: I0310 15:44:11.253014 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:44:11 crc kubenswrapper[4743]: I0310 15:44:11.253111 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:44:11 crc kubenswrapper[4743]: I0310 15:44:11.253187 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:44:11 crc kubenswrapper[4743]: I0310 15:44:11.254350 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:44:11 crc kubenswrapper[4743]: I0310 15:44:11.254420 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" 
containerID="cri-o://707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" gracePeriod=600 Mar 10 15:44:11 crc kubenswrapper[4743]: E0310 15:44:11.381642 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:44:12 crc kubenswrapper[4743]: I0310 15:44:12.301837 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" exitCode=0 Mar 10 15:44:12 crc kubenswrapper[4743]: I0310 15:44:12.301923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"} Mar 10 15:44:12 crc kubenswrapper[4743]: I0310 15:44:12.302472 4743 scope.go:117] "RemoveContainer" containerID="af106cec6bede362dcfa7f972c69b1f3686e1c5bf8c7d44260bf5cc40751d829" Mar 10 15:44:12 crc kubenswrapper[4743]: I0310 15:44:12.303477 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:44:12 crc kubenswrapper[4743]: E0310 15:44:12.304020 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:44:20 crc kubenswrapper[4743]: E0310 15:44:20.955143 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88789c46_c619_4fb1_a76a_1763c8f34eb2.slice/crio-297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:44:22 crc kubenswrapper[4743]: I0310 15:44:22.916842 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:44:22 crc kubenswrapper[4743]: E0310 15:44:22.917585 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:44:31 crc kubenswrapper[4743]: E0310 15:44:31.256896 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88789c46_c619_4fb1_a76a_1763c8f34eb2.slice/crio-297343178b6e0e1f7b3a4334f8a870e0996ccf247e52ff62fc7f436b7f561154.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:44:37 crc kubenswrapper[4743]: I0310 15:44:37.916370 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:44:37 crc kubenswrapper[4743]: E0310 15:44:37.917525 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:44:48 crc kubenswrapper[4743]: I0310 15:44:48.917072 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:44:48 crc kubenswrapper[4743]: E0310 15:44:48.919242 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.502379 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mjqzl"] Mar 10 15:44:53 crc kubenswrapper[4743]: E0310 15:44:53.503470 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f" containerName="oc" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.503492 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f" containerName="oc" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.503721 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f" containerName="oc" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.505458 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.521434 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjqzl"] Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.601207 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-catalog-content\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.601314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-utilities\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.601414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4g5k\" (UniqueName: \"kubernetes.io/projected/ce1a061a-96b8-4f40-983f-a3d145316862-kube-api-access-q4g5k\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.703122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-utilities\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.703565 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-q4g5k\" (UniqueName: \"kubernetes.io/projected/ce1a061a-96b8-4f40-983f-a3d145316862-kube-api-access-q4g5k\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.703760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-catalog-content\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.703656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-utilities\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.704112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-catalog-content\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.732449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4g5k\" (UniqueName: \"kubernetes.io/projected/ce1a061a-96b8-4f40-983f-a3d145316862-kube-api-access-q4g5k\") pod \"redhat-marketplace-mjqzl\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:53 crc kubenswrapper[4743]: I0310 15:44:53.831078 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:44:54 crc kubenswrapper[4743]: I0310 15:44:54.334753 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjqzl"] Mar 10 15:44:54 crc kubenswrapper[4743]: I0310 15:44:54.849330 4743 generic.go:334] "Generic (PLEG): container finished" podID="ce1a061a-96b8-4f40-983f-a3d145316862" containerID="26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17" exitCode=0 Mar 10 15:44:54 crc kubenswrapper[4743]: I0310 15:44:54.849377 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjqzl" event={"ID":"ce1a061a-96b8-4f40-983f-a3d145316862","Type":"ContainerDied","Data":"26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17"} Mar 10 15:44:54 crc kubenswrapper[4743]: I0310 15:44:54.849401 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjqzl" event={"ID":"ce1a061a-96b8-4f40-983f-a3d145316862","Type":"ContainerStarted","Data":"cb152577c5169a23ed16be6fb5049e3abfcbdd767a3054e6e7800171de51aa13"} Mar 10 15:44:55 crc kubenswrapper[4743]: I0310 15:44:55.858760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjqzl" event={"ID":"ce1a061a-96b8-4f40-983f-a3d145316862","Type":"ContainerStarted","Data":"c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b"} Mar 10 15:44:55 crc kubenswrapper[4743]: I0310 15:44:55.863148 4743 scope.go:117] "RemoveContainer" containerID="e4a836464e6c7c8129fe4a476121c268334b2a07a7c7893af23b9029b67787ef" Mar 10 15:44:56 crc kubenswrapper[4743]: I0310 15:44:56.871892 4743 generic.go:334] "Generic (PLEG): container finished" podID="ce1a061a-96b8-4f40-983f-a3d145316862" containerID="c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b" exitCode=0 Mar 10 15:44:56 crc kubenswrapper[4743]: I0310 15:44:56.872053 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjqzl" event={"ID":"ce1a061a-96b8-4f40-983f-a3d145316862","Type":"ContainerDied","Data":"c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b"} Mar 10 15:44:57 crc kubenswrapper[4743]: I0310 15:44:57.888156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjqzl" event={"ID":"ce1a061a-96b8-4f40-983f-a3d145316862","Type":"ContainerStarted","Data":"7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990"} Mar 10 15:44:57 crc kubenswrapper[4743]: I0310 15:44:57.913252 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mjqzl" podStartSLOduration=2.448968614 podStartE2EDuration="4.913232772s" podCreationTimestamp="2026-03-10 15:44:53 +0000 UTC" firstStartedPulling="2026-03-10 15:44:54.85143469 +0000 UTC m=+2359.558249438" lastFinishedPulling="2026-03-10 15:44:57.315698848 +0000 UTC m=+2362.022513596" observedRunningTime="2026-03-10 15:44:57.907629802 +0000 UTC m=+2362.614444590" watchObservedRunningTime="2026-03-10 15:44:57.913232772 +0000 UTC m=+2362.620047530" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.168774 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq"] Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.172609 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.176374 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.176776 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.189726 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq"] Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.238767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v76m8\" (UniqueName: \"kubernetes.io/projected/054edb77-7d07-4c2b-adf6-c50909d6dc2b-kube-api-access-v76m8\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.238985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/054edb77-7d07-4c2b-adf6-c50909d6dc2b-config-volume\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.239251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/054edb77-7d07-4c2b-adf6-c50909d6dc2b-secret-volume\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.341638 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/054edb77-7d07-4c2b-adf6-c50909d6dc2b-secret-volume\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.341733 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v76m8\" (UniqueName: \"kubernetes.io/projected/054edb77-7d07-4c2b-adf6-c50909d6dc2b-kube-api-access-v76m8\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.341806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/054edb77-7d07-4c2b-adf6-c50909d6dc2b-config-volume\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.342753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/054edb77-7d07-4c2b-adf6-c50909d6dc2b-config-volume\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.350512 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/054edb77-7d07-4c2b-adf6-c50909d6dc2b-secret-volume\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.374091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v76m8\" (UniqueName: \"kubernetes.io/projected/054edb77-7d07-4c2b-adf6-c50909d6dc2b-kube-api-access-v76m8\") pod \"collect-profiles-29552625-qsbsq\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.495242 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.915653 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:45:00 crc kubenswrapper[4743]: E0310 15:45:00.916593 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:45:00 crc kubenswrapper[4743]: W0310 15:45:00.983157 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod054edb77_7d07_4c2b_adf6_c50909d6dc2b.slice/crio-c9b2bc539b40529d7d9d7a22bc35d61956b3ecafea6cfa824c3b75cd3c061b27 WatchSource:0}: Error finding container c9b2bc539b40529d7d9d7a22bc35d61956b3ecafea6cfa824c3b75cd3c061b27: Status 404 returned error 
can't find the container with id c9b2bc539b40529d7d9d7a22bc35d61956b3ecafea6cfa824c3b75cd3c061b27 Mar 10 15:45:00 crc kubenswrapper[4743]: I0310 15:45:00.984945 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq"] Mar 10 15:45:01 crc kubenswrapper[4743]: I0310 15:45:01.934794 4743 generic.go:334] "Generic (PLEG): container finished" podID="054edb77-7d07-4c2b-adf6-c50909d6dc2b" containerID="d76631c845312f3b1b7694f669589e294c5e941eda404c319e3c3fe45e8a40a4" exitCode=0 Mar 10 15:45:01 crc kubenswrapper[4743]: I0310 15:45:01.935420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" event={"ID":"054edb77-7d07-4c2b-adf6-c50909d6dc2b","Type":"ContainerDied","Data":"d76631c845312f3b1b7694f669589e294c5e941eda404c319e3c3fe45e8a40a4"} Mar 10 15:45:01 crc kubenswrapper[4743]: I0310 15:45:01.935454 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" event={"ID":"054edb77-7d07-4c2b-adf6-c50909d6dc2b","Type":"ContainerStarted","Data":"c9b2bc539b40529d7d9d7a22bc35d61956b3ecafea6cfa824c3b75cd3c061b27"} Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.340541 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.420014 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v76m8\" (UniqueName: \"kubernetes.io/projected/054edb77-7d07-4c2b-adf6-c50909d6dc2b-kube-api-access-v76m8\") pod \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.420520 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/054edb77-7d07-4c2b-adf6-c50909d6dc2b-config-volume\") pod \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.420579 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/054edb77-7d07-4c2b-adf6-c50909d6dc2b-secret-volume\") pod \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\" (UID: \"054edb77-7d07-4c2b-adf6-c50909d6dc2b\") " Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.421281 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/054edb77-7d07-4c2b-adf6-c50909d6dc2b-config-volume" (OuterVolumeSpecName: "config-volume") pod "054edb77-7d07-4c2b-adf6-c50909d6dc2b" (UID: "054edb77-7d07-4c2b-adf6-c50909d6dc2b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.421697 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/054edb77-7d07-4c2b-adf6-c50909d6dc2b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.428947 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054edb77-7d07-4c2b-adf6-c50909d6dc2b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "054edb77-7d07-4c2b-adf6-c50909d6dc2b" (UID: "054edb77-7d07-4c2b-adf6-c50909d6dc2b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.429391 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054edb77-7d07-4c2b-adf6-c50909d6dc2b-kube-api-access-v76m8" (OuterVolumeSpecName: "kube-api-access-v76m8") pod "054edb77-7d07-4c2b-adf6-c50909d6dc2b" (UID: "054edb77-7d07-4c2b-adf6-c50909d6dc2b"). InnerVolumeSpecName "kube-api-access-v76m8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.524240 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v76m8\" (UniqueName: \"kubernetes.io/projected/054edb77-7d07-4c2b-adf6-c50909d6dc2b-kube-api-access-v76m8\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.524279 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/054edb77-7d07-4c2b-adf6-c50909d6dc2b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.831660 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.831737 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.892576 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.974141 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" event={"ID":"054edb77-7d07-4c2b-adf6-c50909d6dc2b","Type":"ContainerDied","Data":"c9b2bc539b40529d7d9d7a22bc35d61956b3ecafea6cfa824c3b75cd3c061b27"} Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.974189 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b2bc539b40529d7d9d7a22bc35d61956b3ecafea6cfa824c3b75cd3c061b27" Mar 10 15:45:03 crc kubenswrapper[4743]: I0310 15:45:03.974155 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq" Mar 10 15:45:04 crc kubenswrapper[4743]: I0310 15:45:04.088156 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:45:04 crc kubenswrapper[4743]: I0310 15:45:04.144794 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjqzl"] Mar 10 15:45:04 crc kubenswrapper[4743]: I0310 15:45:04.427330 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7"] Mar 10 15:45:04 crc kubenswrapper[4743]: I0310 15:45:04.434993 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-l25l7"] Mar 10 15:45:05 crc kubenswrapper[4743]: I0310 15:45:05.940048 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89" path="/var/lib/kubelet/pods/e30bc6f1-5904-4007-b8e9-6c9cfb9f1d89/volumes" Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.006995 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mjqzl" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" containerName="registry-server" containerID="cri-o://7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990" gracePeriod=2 Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.462434 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.600539 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4g5k\" (UniqueName: \"kubernetes.io/projected/ce1a061a-96b8-4f40-983f-a3d145316862-kube-api-access-q4g5k\") pod \"ce1a061a-96b8-4f40-983f-a3d145316862\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.600838 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-utilities\") pod \"ce1a061a-96b8-4f40-983f-a3d145316862\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.601038 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-catalog-content\") pod \"ce1a061a-96b8-4f40-983f-a3d145316862\" (UID: \"ce1a061a-96b8-4f40-983f-a3d145316862\") " Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.601651 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-utilities" (OuterVolumeSpecName: "utilities") pod "ce1a061a-96b8-4f40-983f-a3d145316862" (UID: "ce1a061a-96b8-4f40-983f-a3d145316862"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.610952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce1a061a-96b8-4f40-983f-a3d145316862-kube-api-access-q4g5k" (OuterVolumeSpecName: "kube-api-access-q4g5k") pod "ce1a061a-96b8-4f40-983f-a3d145316862" (UID: "ce1a061a-96b8-4f40-983f-a3d145316862"). InnerVolumeSpecName "kube-api-access-q4g5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.627743 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce1a061a-96b8-4f40-983f-a3d145316862" (UID: "ce1a061a-96b8-4f40-983f-a3d145316862"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.702764 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4g5k\" (UniqueName: \"kubernetes.io/projected/ce1a061a-96b8-4f40-983f-a3d145316862-kube-api-access-q4g5k\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.702792 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:06 crc kubenswrapper[4743]: I0310 15:45:06.702835 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce1a061a-96b8-4f40-983f-a3d145316862-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.039353 4743 generic.go:334] "Generic (PLEG): container finished" podID="ce1a061a-96b8-4f40-983f-a3d145316862" containerID="7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990" exitCode=0 Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.039455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjqzl" event={"ID":"ce1a061a-96b8-4f40-983f-a3d145316862","Type":"ContainerDied","Data":"7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990"} Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.039522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mjqzl" event={"ID":"ce1a061a-96b8-4f40-983f-a3d145316862","Type":"ContainerDied","Data":"cb152577c5169a23ed16be6fb5049e3abfcbdd767a3054e6e7800171de51aa13"} Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.039565 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjqzl" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.039641 4743 scope.go:117] "RemoveContainer" containerID="7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.077531 4743 scope.go:117] "RemoveContainer" containerID="c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.092725 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjqzl"] Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.104532 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjqzl"] Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.109221 4743 scope.go:117] "RemoveContainer" containerID="26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.151087 4743 scope.go:117] "RemoveContainer" containerID="7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990" Mar 10 15:45:07 crc kubenswrapper[4743]: E0310 15:45:07.151595 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990\": container with ID starting with 7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990 not found: ID does not exist" containerID="7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.151653 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990"} err="failed to get container status \"7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990\": rpc error: code = NotFound desc = could not find container \"7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990\": container with ID starting with 7067894eb36055eafa5dbda4fa0ba14de048764cb094381778477cc79f9d3990 not found: ID does not exist" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.151690 4743 scope.go:117] "RemoveContainer" containerID="c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b" Mar 10 15:45:07 crc kubenswrapper[4743]: E0310 15:45:07.152479 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b\": container with ID starting with c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b not found: ID does not exist" containerID="c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.152544 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b"} err="failed to get container status \"c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b\": rpc error: code = NotFound desc = could not find container \"c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b\": container with ID starting with c4e3c026ee79c04a1e2f79f38bcd6b2cf25464a0b1eb9f4d838ff8278641b77b not found: ID does not exist" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.152596 4743 scope.go:117] "RemoveContainer" containerID="26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17" Mar 10 15:45:07 crc kubenswrapper[4743]: E0310 
15:45:07.152977 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17\": container with ID starting with 26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17 not found: ID does not exist" containerID="26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.152998 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17"} err="failed to get container status \"26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17\": rpc error: code = NotFound desc = could not find container \"26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17\": container with ID starting with 26afe6c58d1510162776a49feca5b0cfca289c93fad9d519d46710dd9a892d17 not found: ID does not exist" Mar 10 15:45:07 crc kubenswrapper[4743]: I0310 15:45:07.930116 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" path="/var/lib/kubelet/pods/ce1a061a-96b8-4f40-983f-a3d145316862/volumes" Mar 10 15:45:14 crc kubenswrapper[4743]: I0310 15:45:14.915259 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:45:14 crc kubenswrapper[4743]: E0310 15:45:14.916122 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:45:29 crc kubenswrapper[4743]: I0310 15:45:29.916868 
4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:45:29 crc kubenswrapper[4743]: E0310 15:45:29.918302 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:45:43 crc kubenswrapper[4743]: I0310 15:45:43.917050 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:45:43 crc kubenswrapper[4743]: E0310 15:45:43.918602 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:45:55 crc kubenswrapper[4743]: I0310 15:45:55.943736 4743 scope.go:117] "RemoveContainer" containerID="e5ca210bdab9aeadfaaa4fdaccea09d81610e74728e650ea7841c4c954e71317" Mar 10 15:45:57 crc kubenswrapper[4743]: I0310 15:45:57.915585 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:45:57 crc kubenswrapper[4743]: E0310 15:45:57.916462 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.165017 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552626-tvwx2"] Mar 10 15:46:00 crc kubenswrapper[4743]: E0310 15:46:00.166034 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054edb77-7d07-4c2b-adf6-c50909d6dc2b" containerName="collect-profiles" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.166055 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="054edb77-7d07-4c2b-adf6-c50909d6dc2b" containerName="collect-profiles" Mar 10 15:46:00 crc kubenswrapper[4743]: E0310 15:46:00.166078 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" containerName="extract-utilities" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.166086 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" containerName="extract-utilities" Mar 10 15:46:00 crc kubenswrapper[4743]: E0310 15:46:00.166102 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" containerName="extract-content" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.166111 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" containerName="extract-content" Mar 10 15:46:00 crc kubenswrapper[4743]: E0310 15:46:00.166138 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" containerName="registry-server" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.166146 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" containerName="registry-server" Mar 10 
15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.166395 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1a061a-96b8-4f40-983f-a3d145316862" containerName="registry-server" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.166430 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="054edb77-7d07-4c2b-adf6-c50909d6dc2b" containerName="collect-profiles" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.167408 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552626-tvwx2" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.171701 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.172074 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.172483 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.182048 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552626-tvwx2"] Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.332288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8skpg\" (UniqueName: \"kubernetes.io/projected/d04b5b7c-4428-4790-9c3b-ce2f467f1afa-kube-api-access-8skpg\") pod \"auto-csr-approver-29552626-tvwx2\" (UID: \"d04b5b7c-4428-4790-9c3b-ce2f467f1afa\") " pod="openshift-infra/auto-csr-approver-29552626-tvwx2" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.436200 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8skpg\" (UniqueName: 
\"kubernetes.io/projected/d04b5b7c-4428-4790-9c3b-ce2f467f1afa-kube-api-access-8skpg\") pod \"auto-csr-approver-29552626-tvwx2\" (UID: \"d04b5b7c-4428-4790-9c3b-ce2f467f1afa\") " pod="openshift-infra/auto-csr-approver-29552626-tvwx2" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.462130 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8skpg\" (UniqueName: \"kubernetes.io/projected/d04b5b7c-4428-4790-9c3b-ce2f467f1afa-kube-api-access-8skpg\") pod \"auto-csr-approver-29552626-tvwx2\" (UID: \"d04b5b7c-4428-4790-9c3b-ce2f467f1afa\") " pod="openshift-infra/auto-csr-approver-29552626-tvwx2" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.496595 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552626-tvwx2" Mar 10 15:46:00 crc kubenswrapper[4743]: I0310 15:46:00.848793 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552626-tvwx2"] Mar 10 15:46:01 crc kubenswrapper[4743]: I0310 15:46:01.602491 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552626-tvwx2" event={"ID":"d04b5b7c-4428-4790-9c3b-ce2f467f1afa","Type":"ContainerStarted","Data":"d3309e08b3f253a5913b99707781f574f2247cbbcf731bc1ff2e7658bc168947"} Mar 10 15:46:02 crc kubenswrapper[4743]: I0310 15:46:02.614285 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04b5b7c-4428-4790-9c3b-ce2f467f1afa" containerID="99af52d1d94d1d75d6744e3b4888d3df15e44a8c84df83577a972f3cbc720fe5" exitCode=0 Mar 10 15:46:02 crc kubenswrapper[4743]: I0310 15:46:02.614339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552626-tvwx2" event={"ID":"d04b5b7c-4428-4790-9c3b-ce2f467f1afa","Type":"ContainerDied","Data":"99af52d1d94d1d75d6744e3b4888d3df15e44a8c84df83577a972f3cbc720fe5"} Mar 10 15:46:04 crc kubenswrapper[4743]: I0310 15:46:04.094317 4743 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552626-tvwx2" Mar 10 15:46:04 crc kubenswrapper[4743]: I0310 15:46:04.249048 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8skpg\" (UniqueName: \"kubernetes.io/projected/d04b5b7c-4428-4790-9c3b-ce2f467f1afa-kube-api-access-8skpg\") pod \"d04b5b7c-4428-4790-9c3b-ce2f467f1afa\" (UID: \"d04b5b7c-4428-4790-9c3b-ce2f467f1afa\") " Mar 10 15:46:04 crc kubenswrapper[4743]: I0310 15:46:04.255743 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04b5b7c-4428-4790-9c3b-ce2f467f1afa-kube-api-access-8skpg" (OuterVolumeSpecName: "kube-api-access-8skpg") pod "d04b5b7c-4428-4790-9c3b-ce2f467f1afa" (UID: "d04b5b7c-4428-4790-9c3b-ce2f467f1afa"). InnerVolumeSpecName "kube-api-access-8skpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:46:04 crc kubenswrapper[4743]: I0310 15:46:04.351569 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8skpg\" (UniqueName: \"kubernetes.io/projected/d04b5b7c-4428-4790-9c3b-ce2f467f1afa-kube-api-access-8skpg\") on node \"crc\" DevicePath \"\"" Mar 10 15:46:04 crc kubenswrapper[4743]: I0310 15:46:04.637162 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552626-tvwx2" event={"ID":"d04b5b7c-4428-4790-9c3b-ce2f467f1afa","Type":"ContainerDied","Data":"d3309e08b3f253a5913b99707781f574f2247cbbcf731bc1ff2e7658bc168947"} Mar 10 15:46:04 crc kubenswrapper[4743]: I0310 15:46:04.637229 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3309e08b3f253a5913b99707781f574f2247cbbcf731bc1ff2e7658bc168947" Mar 10 15:46:04 crc kubenswrapper[4743]: I0310 15:46:04.637263 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552626-tvwx2" Mar 10 15:46:05 crc kubenswrapper[4743]: I0310 15:46:05.168010 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552620-gncgx"] Mar 10 15:46:05 crc kubenswrapper[4743]: I0310 15:46:05.177695 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552620-gncgx"] Mar 10 15:46:05 crc kubenswrapper[4743]: I0310 15:46:05.927352 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acab6558-b67f-47a2-a721-59b9a7206996" path="/var/lib/kubelet/pods/acab6558-b67f-47a2-a721-59b9a7206996/volumes" Mar 10 15:46:12 crc kubenswrapper[4743]: I0310 15:46:12.915923 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:46:12 crc kubenswrapper[4743]: E0310 15:46:12.916753 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:46:23 crc kubenswrapper[4743]: I0310 15:46:23.916068 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:46:23 crc kubenswrapper[4743]: E0310 15:46:23.917185 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:46:37 crc kubenswrapper[4743]: I0310 15:46:37.915643 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:46:37 crc kubenswrapper[4743]: E0310 15:46:37.916391 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:46:50 crc kubenswrapper[4743]: I0310 15:46:50.916440 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:46:50 crc kubenswrapper[4743]: E0310 15:46:50.917462 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:46:56 crc kubenswrapper[4743]: I0310 15:46:56.033210 4743 scope.go:117] "RemoveContainer" containerID="46573d2ad2a41fe1b1d66c8c4902a281fa145c0c728caa297991294a1291a76d" Mar 10 15:46:57 crc kubenswrapper[4743]: I0310 15:46:57.412987 4743 generic.go:334] "Generic (PLEG): container finished" podID="b982c5ef-116d-4e18-a707-768c7f0fbfc0" containerID="158e1783698d99efdeaa8127e0cdc981fcfdc8da7f3830a96ab52c64f5792d7b" exitCode=0 Mar 10 15:46:57 crc kubenswrapper[4743]: I0310 15:46:57.413049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" event={"ID":"b982c5ef-116d-4e18-a707-768c7f0fbfc0","Type":"ContainerDied","Data":"158e1783698d99efdeaa8127e0cdc981fcfdc8da7f3830a96ab52c64f5792d7b"} Mar 10 15:46:58 crc kubenswrapper[4743]: I0310 15:46:58.863321 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:46:58 crc kubenswrapper[4743]: I0310 15:46:58.968948 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-combined-ca-bundle\") pod \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " Mar 10 15:46:58 crc kubenswrapper[4743]: I0310 15:46:58.969194 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-secret-0\") pod \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " Mar 10 15:46:58 crc kubenswrapper[4743]: I0310 15:46:58.969341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-ssh-key-openstack-edpm-ipam\") pod \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " Mar 10 15:46:58 crc kubenswrapper[4743]: I0310 15:46:58.969371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwbqp\" (UniqueName: \"kubernetes.io/projected/b982c5ef-116d-4e18-a707-768c7f0fbfc0-kube-api-access-lwbqp\") pod \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " Mar 10 15:46:58 crc kubenswrapper[4743]: I0310 15:46:58.969428 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-inventory\") pod \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\" (UID: \"b982c5ef-116d-4e18-a707-768c7f0fbfc0\") " Mar 10 15:46:58 crc kubenswrapper[4743]: I0310 15:46:58.975417 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b982c5ef-116d-4e18-a707-768c7f0fbfc0" (UID: "b982c5ef-116d-4e18-a707-768c7f0fbfc0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:46:58 crc kubenswrapper[4743]: I0310 15:46:58.977026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b982c5ef-116d-4e18-a707-768c7f0fbfc0-kube-api-access-lwbqp" (OuterVolumeSpecName: "kube-api-access-lwbqp") pod "b982c5ef-116d-4e18-a707-768c7f0fbfc0" (UID: "b982c5ef-116d-4e18-a707-768c7f0fbfc0"). InnerVolumeSpecName "kube-api-access-lwbqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.013215 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-inventory" (OuterVolumeSpecName: "inventory") pod "b982c5ef-116d-4e18-a707-768c7f0fbfc0" (UID: "b982c5ef-116d-4e18-a707-768c7f0fbfc0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.013923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b982c5ef-116d-4e18-a707-768c7f0fbfc0" (UID: "b982c5ef-116d-4e18-a707-768c7f0fbfc0"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.018588 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b982c5ef-116d-4e18-a707-768c7f0fbfc0" (UID: "b982c5ef-116d-4e18-a707-768c7f0fbfc0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.072539 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.072622 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.072697 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.072723 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwbqp\" (UniqueName: \"kubernetes.io/projected/b982c5ef-116d-4e18-a707-768c7f0fbfc0-kube-api-access-lwbqp\") on node \"crc\" DevicePath \"\"" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.072742 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b982c5ef-116d-4e18-a707-768c7f0fbfc0-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.436769 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" event={"ID":"b982c5ef-116d-4e18-a707-768c7f0fbfc0","Type":"ContainerDied","Data":"aed052d005af8b0636b499ed9cb45f532360d2b55d23f7be4fd365f99e96efa4"} Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.437108 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed052d005af8b0636b499ed9cb45f532360d2b55d23f7be4fd365f99e96efa4" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.436878 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fd24m" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.540259 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb"] Mar 10 15:46:59 crc kubenswrapper[4743]: E0310 15:46:59.540730 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04b5b7c-4428-4790-9c3b-ce2f467f1afa" containerName="oc" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.540750 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04b5b7c-4428-4790-9c3b-ce2f467f1afa" containerName="oc" Mar 10 15:46:59 crc kubenswrapper[4743]: E0310 15:46:59.540769 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b982c5ef-116d-4e18-a707-768c7f0fbfc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.540779 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b982c5ef-116d-4e18-a707-768c7f0fbfc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.541046 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04b5b7c-4428-4790-9c3b-ce2f467f1afa" containerName="oc" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.541066 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b982c5ef-116d-4e18-a707-768c7f0fbfc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.541836 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.545096 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.545155 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.545955 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.546079 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.546441 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.546446 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.553046 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.563204 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb"] Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.706630 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707033 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707080 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707131 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttdm9\" (UniqueName: \"kubernetes.io/projected/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-kube-api-access-ttdm9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707494 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.707565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.808970 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809226 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809251 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 
15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809279 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttdm9\" (UniqueName: \"kubernetes.io/projected/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-kube-api-access-ttdm9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.809427 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.810659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.815946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.815971 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.816300 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.816525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" 
Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.816961 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.817001 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.817509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.817828 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.818588 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.828292 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttdm9\" (UniqueName: \"kubernetes.io/projected/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-kube-api-access-ttdm9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bmhcb\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:46:59 crc kubenswrapper[4743]: I0310 15:46:59.911803 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" Mar 10 15:47:00 crc kubenswrapper[4743]: I0310 15:47:00.481619 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb"] Mar 10 15:47:00 crc kubenswrapper[4743]: W0310 15:47:00.491055 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc93be7aa_1386_4f1b_9d49_eeb48c2e982c.slice/crio-2411bb9223048dadf494e918a96564c2dd29eb55e5e73d4cf72f90f4e6c4cd60 WatchSource:0}: Error finding container 2411bb9223048dadf494e918a96564c2dd29eb55e5e73d4cf72f90f4e6c4cd60: Status 404 returned error can't find the container with id 2411bb9223048dadf494e918a96564c2dd29eb55e5e73d4cf72f90f4e6c4cd60 Mar 10 15:47:01 crc kubenswrapper[4743]: I0310 15:47:01.456484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" event={"ID":"c93be7aa-1386-4f1b-9d49-eeb48c2e982c","Type":"ContainerStarted","Data":"2411bb9223048dadf494e918a96564c2dd29eb55e5e73d4cf72f90f4e6c4cd60"} Mar 10 15:47:02 crc kubenswrapper[4743]: I0310 15:47:02.467363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" event={"ID":"c93be7aa-1386-4f1b-9d49-eeb48c2e982c","Type":"ContainerStarted","Data":"11a8646ddc53300c47bb1270ed2af28d0283972b93a03a13a10008422c0e6431"} Mar 10 15:47:02 crc kubenswrapper[4743]: I0310 15:47:02.499182 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" podStartSLOduration=2.78991025 podStartE2EDuration="3.499160288s" podCreationTimestamp="2026-03-10 15:46:59 +0000 UTC" firstStartedPulling="2026-03-10 15:47:00.492995458 +0000 UTC m=+2485.199810206" lastFinishedPulling="2026-03-10 15:47:01.202245486 +0000 UTC m=+2485.909060244" observedRunningTime="2026-03-10 15:47:02.490972102 +0000 UTC m=+2487.197786870" watchObservedRunningTime="2026-03-10 15:47:02.499160288 +0000 UTC m=+2487.205975036" Mar 10 15:47:02 crc kubenswrapper[4743]: I0310 15:47:02.916263 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:47:02 crc kubenswrapper[4743]: E0310 15:47:02.916664 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:47:16 crc kubenswrapper[4743]: I0310 15:47:16.915709 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:47:16 crc kubenswrapper[4743]: E0310 15:47:16.916790 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:47:28 crc kubenswrapper[4743]: I0310 15:47:28.915797 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:47:28 crc kubenswrapper[4743]: E0310 15:47:28.917837 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:47:42 crc kubenswrapper[4743]: I0310 15:47:42.916406 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:47:42 crc kubenswrapper[4743]: E0310 15:47:42.917630 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:47:56 crc kubenswrapper[4743]: I0310 15:47:56.916217 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:47:56 crc kubenswrapper[4743]: E0310 15:47:56.917636 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.162273 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552628-bddh8"] Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.164093 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552628-bddh8" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.167338 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.167451 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.172328 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.173878 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552628-bddh8"] Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.243445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs4s7\" (UniqueName: \"kubernetes.io/projected/1af1be6e-8920-4180-94c1-c4c512a25cc4-kube-api-access-cs4s7\") pod \"auto-csr-approver-29552628-bddh8\" (UID: \"1af1be6e-8920-4180-94c1-c4c512a25cc4\") " pod="openshift-infra/auto-csr-approver-29552628-bddh8" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.345957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs4s7\" (UniqueName: 
\"kubernetes.io/projected/1af1be6e-8920-4180-94c1-c4c512a25cc4-kube-api-access-cs4s7\") pod \"auto-csr-approver-29552628-bddh8\" (UID: \"1af1be6e-8920-4180-94c1-c4c512a25cc4\") " pod="openshift-infra/auto-csr-approver-29552628-bddh8" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.373288 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs4s7\" (UniqueName: \"kubernetes.io/projected/1af1be6e-8920-4180-94c1-c4c512a25cc4-kube-api-access-cs4s7\") pod \"auto-csr-approver-29552628-bddh8\" (UID: \"1af1be6e-8920-4180-94c1-c4c512a25cc4\") " pod="openshift-infra/auto-csr-approver-29552628-bddh8" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.486597 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552628-bddh8" Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.818165 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552628-bddh8"] Mar 10 15:48:00 crc kubenswrapper[4743]: I0310 15:48:00.825009 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:48:01 crc kubenswrapper[4743]: I0310 15:48:01.085547 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552628-bddh8" event={"ID":"1af1be6e-8920-4180-94c1-c4c512a25cc4","Type":"ContainerStarted","Data":"9b2d37faee8c8fe78736849c1cdcde9a2c178afa1f35b59eeb7db3a496a160b6"} Mar 10 15:48:03 crc kubenswrapper[4743]: I0310 15:48:03.111716 4743 generic.go:334] "Generic (PLEG): container finished" podID="1af1be6e-8920-4180-94c1-c4c512a25cc4" containerID="eb181bf94b5a86035237eb06ad80ccba725e86e81bb36538cc70114154596345" exitCode=0 Mar 10 15:48:03 crc kubenswrapper[4743]: I0310 15:48:03.111838 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552628-bddh8" 
event={"ID":"1af1be6e-8920-4180-94c1-c4c512a25cc4","Type":"ContainerDied","Data":"eb181bf94b5a86035237eb06ad80ccba725e86e81bb36538cc70114154596345"} Mar 10 15:48:04 crc kubenswrapper[4743]: I0310 15:48:04.559311 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552628-bddh8" Mar 10 15:48:04 crc kubenswrapper[4743]: I0310 15:48:04.653660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs4s7\" (UniqueName: \"kubernetes.io/projected/1af1be6e-8920-4180-94c1-c4c512a25cc4-kube-api-access-cs4s7\") pod \"1af1be6e-8920-4180-94c1-c4c512a25cc4\" (UID: \"1af1be6e-8920-4180-94c1-c4c512a25cc4\") " Mar 10 15:48:04 crc kubenswrapper[4743]: I0310 15:48:04.660184 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af1be6e-8920-4180-94c1-c4c512a25cc4-kube-api-access-cs4s7" (OuterVolumeSpecName: "kube-api-access-cs4s7") pod "1af1be6e-8920-4180-94c1-c4c512a25cc4" (UID: "1af1be6e-8920-4180-94c1-c4c512a25cc4"). InnerVolumeSpecName "kube-api-access-cs4s7". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:48:04 crc kubenswrapper[4743]: I0310 15:48:04.756026 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs4s7\" (UniqueName: \"kubernetes.io/projected/1af1be6e-8920-4180-94c1-c4c512a25cc4-kube-api-access-cs4s7\") on node \"crc\" DevicePath \"\""
Mar 10 15:48:05 crc kubenswrapper[4743]: I0310 15:48:05.150370 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552628-bddh8" event={"ID":"1af1be6e-8920-4180-94c1-c4c512a25cc4","Type":"ContainerDied","Data":"9b2d37faee8c8fe78736849c1cdcde9a2c178afa1f35b59eeb7db3a496a160b6"}
Mar 10 15:48:05 crc kubenswrapper[4743]: I0310 15:48:05.150418 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2d37faee8c8fe78736849c1cdcde9a2c178afa1f35b59eeb7db3a496a160b6"
Mar 10 15:48:05 crc kubenswrapper[4743]: I0310 15:48:05.150486 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552628-bddh8"
Mar 10 15:48:05 crc kubenswrapper[4743]: I0310 15:48:05.660207 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552622-26d76"]
Mar 10 15:48:05 crc kubenswrapper[4743]: I0310 15:48:05.669481 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552622-26d76"]
Mar 10 15:48:05 crc kubenswrapper[4743]: I0310 15:48:05.960166 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca96e86-ba03-4721-a048-44952f8dd42c" path="/var/lib/kubelet/pods/0ca96e86-ba03-4721-a048-44952f8dd42c/volumes"
Mar 10 15:48:07 crc kubenswrapper[4743]: I0310 15:48:07.916237 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"
Mar 10 15:48:07 crc kubenswrapper[4743]: E0310 15:48:07.916669 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 15:48:21 crc kubenswrapper[4743]: I0310 15:48:21.916012 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"
Mar 10 15:48:21 crc kubenswrapper[4743]: E0310 15:48:21.916723 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.254792 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jrjg7"]
Mar 10 15:48:31 crc kubenswrapper[4743]: E0310 15:48:31.257316 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af1be6e-8920-4180-94c1-c4c512a25cc4" containerName="oc"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.257456 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af1be6e-8920-4180-94c1-c4c512a25cc4" containerName="oc"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.257902 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af1be6e-8920-4180-94c1-c4c512a25cc4" containerName="oc"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.260003 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.267354 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jrjg7"]
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.346717 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwb2l\" (UniqueName: \"kubernetes.io/projected/a9b3dac1-b946-4c51-be2a-567980de8aa9-kube-api-access-kwb2l\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.346961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-utilities\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.346993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-catalog-content\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.449203 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwb2l\" (UniqueName: \"kubernetes.io/projected/a9b3dac1-b946-4c51-be2a-567980de8aa9-kube-api-access-kwb2l\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.449634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-utilities\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.449756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-catalog-content\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.450047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-utilities\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.450394 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-catalog-content\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.478367 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwb2l\" (UniqueName: \"kubernetes.io/projected/a9b3dac1-b946-4c51-be2a-567980de8aa9-kube-api-access-kwb2l\") pod \"certified-operators-jrjg7\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") " pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.592403 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:31 crc kubenswrapper[4743]: I0310 15:48:31.941027 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jrjg7"]
Mar 10 15:48:32 crc kubenswrapper[4743]: I0310 15:48:32.449633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrjg7" event={"ID":"a9b3dac1-b946-4c51-be2a-567980de8aa9","Type":"ContainerStarted","Data":"14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474"}
Mar 10 15:48:32 crc kubenswrapper[4743]: I0310 15:48:32.449996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrjg7" event={"ID":"a9b3dac1-b946-4c51-be2a-567980de8aa9","Type":"ContainerStarted","Data":"bdfd164988fbf360c940298c9b789c9483c5dece7acee5e4073b8a74ae52b795"}
Mar 10 15:48:33 crc kubenswrapper[4743]: I0310 15:48:33.460119 4743 generic.go:334] "Generic (PLEG): container finished" podID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerID="14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474" exitCode=0
Mar 10 15:48:33 crc kubenswrapper[4743]: I0310 15:48:33.460253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrjg7" event={"ID":"a9b3dac1-b946-4c51-be2a-567980de8aa9","Type":"ContainerDied","Data":"14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474"}
Mar 10 15:48:33 crc kubenswrapper[4743]: I0310 15:48:33.915201 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"
Mar 10 15:48:33 crc kubenswrapper[4743]: E0310 15:48:33.915595 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 15:48:35 crc kubenswrapper[4743]: I0310 15:48:35.482923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrjg7" event={"ID":"a9b3dac1-b946-4c51-be2a-567980de8aa9","Type":"ContainerStarted","Data":"eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe"}
Mar 10 15:48:36 crc kubenswrapper[4743]: I0310 15:48:36.497510 4743 generic.go:334] "Generic (PLEG): container finished" podID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerID="eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe" exitCode=0
Mar 10 15:48:36 crc kubenswrapper[4743]: I0310 15:48:36.497988 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrjg7" event={"ID":"a9b3dac1-b946-4c51-be2a-567980de8aa9","Type":"ContainerDied","Data":"eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe"}
Mar 10 15:48:37 crc kubenswrapper[4743]: I0310 15:48:37.515270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrjg7" event={"ID":"a9b3dac1-b946-4c51-be2a-567980de8aa9","Type":"ContainerStarted","Data":"be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b"}
Mar 10 15:48:37 crc kubenswrapper[4743]: I0310 15:48:37.550115 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jrjg7" podStartSLOduration=3.053662108 podStartE2EDuration="6.550094639s" podCreationTimestamp="2026-03-10 15:48:31 +0000 UTC" firstStartedPulling="2026-03-10 15:48:33.462666731 +0000 UTC m=+2578.169481499" lastFinishedPulling="2026-03-10 15:48:36.959099282 +0000 UTC m=+2581.665914030" observedRunningTime="2026-03-10 15:48:37.544227387 +0000 UTC m=+2582.251042175" watchObservedRunningTime="2026-03-10 15:48:37.550094639 +0000 UTC m=+2582.256909377"
Mar 10 15:48:41 crc kubenswrapper[4743]: I0310 15:48:41.592985 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:41 crc kubenswrapper[4743]: I0310 15:48:41.593754 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:41 crc kubenswrapper[4743]: I0310 15:48:41.667057 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:42 crc kubenswrapper[4743]: I0310 15:48:42.633185 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:42 crc kubenswrapper[4743]: I0310 15:48:42.685808 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jrjg7"]
Mar 10 15:48:44 crc kubenswrapper[4743]: I0310 15:48:44.577677 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jrjg7" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerName="registry-server" containerID="cri-o://be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b" gracePeriod=2
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.061083 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.151338 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwb2l\" (UniqueName: \"kubernetes.io/projected/a9b3dac1-b946-4c51-be2a-567980de8aa9-kube-api-access-kwb2l\") pod \"a9b3dac1-b946-4c51-be2a-567980de8aa9\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") "
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.151650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-catalog-content\") pod \"a9b3dac1-b946-4c51-be2a-567980de8aa9\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") "
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.151672 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-utilities\") pod \"a9b3dac1-b946-4c51-be2a-567980de8aa9\" (UID: \"a9b3dac1-b946-4c51-be2a-567980de8aa9\") "
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.152493 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-utilities" (OuterVolumeSpecName: "utilities") pod "a9b3dac1-b946-4c51-be2a-567980de8aa9" (UID: "a9b3dac1-b946-4c51-be2a-567980de8aa9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.161414 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b3dac1-b946-4c51-be2a-567980de8aa9-kube-api-access-kwb2l" (OuterVolumeSpecName: "kube-api-access-kwb2l") pod "a9b3dac1-b946-4c51-be2a-567980de8aa9" (UID: "a9b3dac1-b946-4c51-be2a-567980de8aa9"). InnerVolumeSpecName "kube-api-access-kwb2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.209234 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9b3dac1-b946-4c51-be2a-567980de8aa9" (UID: "a9b3dac1-b946-4c51-be2a-567980de8aa9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.254189 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.254509 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b3dac1-b946-4c51-be2a-567980de8aa9-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.254617 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwb2l\" (UniqueName: \"kubernetes.io/projected/a9b3dac1-b946-4c51-be2a-567980de8aa9-kube-api-access-kwb2l\") on node \"crc\" DevicePath \"\""
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.604175 4743 generic.go:334] "Generic (PLEG): container finished" podID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerID="be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b" exitCode=0
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.604283 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrjg7"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.604318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrjg7" event={"ID":"a9b3dac1-b946-4c51-be2a-567980de8aa9","Type":"ContainerDied","Data":"be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b"}
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.605599 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrjg7" event={"ID":"a9b3dac1-b946-4c51-be2a-567980de8aa9","Type":"ContainerDied","Data":"bdfd164988fbf360c940298c9b789c9483c5dece7acee5e4073b8a74ae52b795"}
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.605635 4743 scope.go:117] "RemoveContainer" containerID="be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.633736 4743 scope.go:117] "RemoveContainer" containerID="eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.658667 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jrjg7"]
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.673326 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jrjg7"]
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.674087 4743 scope.go:117] "RemoveContainer" containerID="14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.706901 4743 scope.go:117] "RemoveContainer" containerID="be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b"
Mar 10 15:48:45 crc kubenswrapper[4743]: E0310 15:48:45.707504 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b\": container with ID starting with be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b not found: ID does not exist" containerID="be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.707577 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b"} err="failed to get container status \"be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b\": rpc error: code = NotFound desc = could not find container \"be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b\": container with ID starting with be093c67f6b2f7a425bf1020b3e51bceb8b41004ec995201f397a272b9de9a7b not found: ID does not exist"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.707617 4743 scope.go:117] "RemoveContainer" containerID="eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe"
Mar 10 15:48:45 crc kubenswrapper[4743]: E0310 15:48:45.708310 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe\": container with ID starting with eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe not found: ID does not exist" containerID="eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.708410 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe"} err="failed to get container status \"eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe\": rpc error: code = NotFound desc = could not find container \"eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe\": container with ID starting with eeb3924922b11842b5a643bebbff2f2f65042f64b8883fb4138302ce76c441fe not found: ID does not exist"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.708489 4743 scope.go:117] "RemoveContainer" containerID="14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474"
Mar 10 15:48:45 crc kubenswrapper[4743]: E0310 15:48:45.709067 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474\": container with ID starting with 14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474 not found: ID does not exist" containerID="14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.709119 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474"} err="failed to get container status \"14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474\": rpc error: code = NotFound desc = could not find container \"14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474\": container with ID starting with 14190edf3dd9e7a7ddcfcf6aa590214542371d75723e0a0bf2ce731ca595b474 not found: ID does not exist"
Mar 10 15:48:45 crc kubenswrapper[4743]: I0310 15:48:45.930293 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" path="/var/lib/kubelet/pods/a9b3dac1-b946-4c51-be2a-567980de8aa9/volumes"
Mar 10 15:48:47 crc kubenswrapper[4743]: I0310 15:48:47.916526 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"
Mar 10 15:48:47 crc kubenswrapper[4743]: E0310 15:48:47.917252 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 15:48:56 crc kubenswrapper[4743]: I0310 15:48:56.142907 4743 scope.go:117] "RemoveContainer" containerID="3438dd95ee26942bb61121df3ae0b6bed2c5b5648de8e37eeb9e8ea494a09e6f"
Mar 10 15:49:01 crc kubenswrapper[4743]: I0310 15:49:01.916115 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"
Mar 10 15:49:01 crc kubenswrapper[4743]: E0310 15:49:01.918217 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 15:49:14 crc kubenswrapper[4743]: I0310 15:49:14.917028 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e"
Mar 10 15:49:15 crc kubenswrapper[4743]: I0310 15:49:15.896056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"9e7a3799d3537217be6b939fa0048b4e6f8b92c185d54b11efba8d466671befc"}
Mar 10 15:49:21 crc kubenswrapper[4743]: I0310 15:49:21.952454 4743 generic.go:334] "Generic (PLEG): container finished" podID="c93be7aa-1386-4f1b-9d49-eeb48c2e982c" containerID="11a8646ddc53300c47bb1270ed2af28d0283972b93a03a13a10008422c0e6431" exitCode=0
Mar 10 15:49:21 crc kubenswrapper[4743]: I0310 15:49:21.952503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" event={"ID":"c93be7aa-1386-4f1b-9d49-eeb48c2e982c","Type":"ContainerDied","Data":"11a8646ddc53300c47bb1270ed2af28d0283972b93a03a13a10008422c0e6431"}
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.427536 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb"
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-1\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625631 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-combined-ca-bundle\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625661 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-3\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625740 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-0\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625786 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-extra-config-0\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-0\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625921 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-1\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625949 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttdm9\" (UniqueName: \"kubernetes.io/projected/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-kube-api-access-ttdm9\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625967 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-ssh-key-openstack-edpm-ipam\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.625991 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-inventory\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.626055 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-2\") pod \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\" (UID: \"c93be7aa-1386-4f1b-9d49-eeb48c2e982c\") "
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.632295 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.632950 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-kube-api-access-ttdm9" (OuterVolumeSpecName: "kube-api-access-ttdm9") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "kube-api-access-ttdm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.655029 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.656176 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.661316 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.662126 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.663989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.668803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.680941 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-inventory" (OuterVolumeSpecName: "inventory") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.687065 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.687366 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c93be7aa-1386-4f1b-9d49-eeb48c2e982c" (UID: "c93be7aa-1386-4f1b-9d49-eeb48c2e982c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.728889 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.728932 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.728946 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.728958 4743 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.728969 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.728979 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.728990 4743 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.729000 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.729011 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.729025 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttdm9\" (UniqueName: \"kubernetes.io/projected/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-kube-api-access-ttdm9\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.729038 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c93be7aa-1386-4f1b-9d49-eeb48c2e982c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.974891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb" event={"ID":"c93be7aa-1386-4f1b-9d49-eeb48c2e982c","Type":"ContainerDied","Data":"2411bb9223048dadf494e918a96564c2dd29eb55e5e73d4cf72f90f4e6c4cd60"}
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.975524 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2411bb9223048dadf494e918a96564c2dd29eb55e5e73d4cf72f90f4e6c4cd60"
Mar 10 15:49:23 crc kubenswrapper[4743]: I0310 15:49:23.974974 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bmhcb"
Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.075031 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5"]
Mar 10 15:49:24 crc kubenswrapper[4743]: E0310 15:49:24.075448 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerName="extract-utilities"
Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.075466 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerName="extract-utilities"
Mar 10 15:49:24 crc kubenswrapper[4743]: E0310 15:49:24.075488 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93be7aa-1386-4f1b-9d49-eeb48c2e982c" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.075496 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93be7aa-1386-4f1b-9d49-eeb48c2e982c" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:49:24 crc kubenswrapper[4743]: E0310 15:49:24.075531 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerName="registry-server"
Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.075537 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerName="registry-server"
Mar 10 15:49:24 crc kubenswrapper[4743]: E0310 15:49:24.075548 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerName="extract-content"
Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.075553 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerName="extract-content"
Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.075735 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b3dac1-b946-4c51-be2a-567980de8aa9" containerName="registry-server" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.075752 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93be7aa-1386-4f1b-9d49-eeb48c2e982c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.076441 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.078298 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.080062 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lgg74" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.080397 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.080642 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.082352 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.091672 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5"] Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.138406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.138457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.138494 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.138559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.138588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.138629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p979b\" (UniqueName: \"kubernetes.io/projected/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-kube-api-access-p979b\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.138654 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.240698 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p979b\" (UniqueName: \"kubernetes.io/projected/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-kube-api-access-p979b\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.240775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 
15:49:24.240885 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.240912 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.240949 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.241002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.241031 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.259349 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.259442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.266977 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.267410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.268584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.275621 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.289635 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p979b\" (UniqueName: \"kubernetes.io/projected/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-kube-api-access-p979b\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-swbc5\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.392626 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.899639 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5"] Mar 10 15:49:24 crc kubenswrapper[4743]: I0310 15:49:24.983797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" event={"ID":"afce2ed9-7b72-4bee-a5f1-689f9f6888d8","Type":"ContainerStarted","Data":"76a5ad9a98a5917fbd6d1a838bfd725dd28ec9000760ea307668adbfb4ea5c92"} Mar 10 15:49:26 crc kubenswrapper[4743]: I0310 15:49:26.023025 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" event={"ID":"afce2ed9-7b72-4bee-a5f1-689f9f6888d8","Type":"ContainerStarted","Data":"24b24da54efcf740ad7efeb5b5ed5948d9f5a93e25057d6d055efe73c36d9cd6"} Mar 10 15:49:26 crc kubenswrapper[4743]: I0310 15:49:26.048898 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" podStartSLOduration=1.469757683 podStartE2EDuration="2.048877291s" podCreationTimestamp="2026-03-10 15:49:24 +0000 UTC" firstStartedPulling="2026-03-10 15:49:24.908608611 +0000 UTC m=+2629.615423359" lastFinishedPulling="2026-03-10 15:49:25.487728199 +0000 UTC m=+2630.194542967" observedRunningTime="2026-03-10 15:49:26.042683049 +0000 UTC m=+2630.749497797" watchObservedRunningTime="2026-03-10 15:49:26.048877291 +0000 UTC m=+2630.755692039" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.160764 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552630-bldfb"] Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.163115 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-bldfb" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.168549 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.169705 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.169857 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.172249 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-bldfb"] Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.244838 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdpwv\" (UniqueName: \"kubernetes.io/projected/8c505d23-bed2-4d65-864f-47bda8527537-kube-api-access-hdpwv\") pod \"auto-csr-approver-29552630-bldfb\" (UID: \"8c505d23-bed2-4d65-864f-47bda8527537\") " pod="openshift-infra/auto-csr-approver-29552630-bldfb" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.346617 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdpwv\" (UniqueName: \"kubernetes.io/projected/8c505d23-bed2-4d65-864f-47bda8527537-kube-api-access-hdpwv\") pod \"auto-csr-approver-29552630-bldfb\" (UID: \"8c505d23-bed2-4d65-864f-47bda8527537\") " pod="openshift-infra/auto-csr-approver-29552630-bldfb" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.366912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdpwv\" (UniqueName: \"kubernetes.io/projected/8c505d23-bed2-4d65-864f-47bda8527537-kube-api-access-hdpwv\") pod \"auto-csr-approver-29552630-bldfb\" (UID: \"8c505d23-bed2-4d65-864f-47bda8527537\") " 
pod="openshift-infra/auto-csr-approver-29552630-bldfb" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.494890 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-bldfb" Mar 10 15:50:00 crc kubenswrapper[4743]: I0310 15:50:00.977607 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-bldfb"] Mar 10 15:50:01 crc kubenswrapper[4743]: I0310 15:50:01.377333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-bldfb" event={"ID":"8c505d23-bed2-4d65-864f-47bda8527537","Type":"ContainerStarted","Data":"ac321f8ae63315a05c4059d7b6996617441fdd338b89c4b5457c0195fa5fbd64"} Mar 10 15:50:03 crc kubenswrapper[4743]: I0310 15:50:03.400157 4743 generic.go:334] "Generic (PLEG): container finished" podID="8c505d23-bed2-4d65-864f-47bda8527537" containerID="412b28d98ea25215432eaaeed307e4fdf4211734e47ee766123617d12cb9b83a" exitCode=0 Mar 10 15:50:03 crc kubenswrapper[4743]: I0310 15:50:03.400561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-bldfb" event={"ID":"8c505d23-bed2-4d65-864f-47bda8527537","Type":"ContainerDied","Data":"412b28d98ea25215432eaaeed307e4fdf4211734e47ee766123617d12cb9b83a"} Mar 10 15:50:04 crc kubenswrapper[4743]: I0310 15:50:04.813645 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-bldfb" Mar 10 15:50:04 crc kubenswrapper[4743]: I0310 15:50:04.949976 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdpwv\" (UniqueName: \"kubernetes.io/projected/8c505d23-bed2-4d65-864f-47bda8527537-kube-api-access-hdpwv\") pod \"8c505d23-bed2-4d65-864f-47bda8527537\" (UID: \"8c505d23-bed2-4d65-864f-47bda8527537\") " Mar 10 15:50:04 crc kubenswrapper[4743]: I0310 15:50:04.955271 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c505d23-bed2-4d65-864f-47bda8527537-kube-api-access-hdpwv" (OuterVolumeSpecName: "kube-api-access-hdpwv") pod "8c505d23-bed2-4d65-864f-47bda8527537" (UID: "8c505d23-bed2-4d65-864f-47bda8527537"). InnerVolumeSpecName "kube-api-access-hdpwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:50:05 crc kubenswrapper[4743]: I0310 15:50:05.053517 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdpwv\" (UniqueName: \"kubernetes.io/projected/8c505d23-bed2-4d65-864f-47bda8527537-kube-api-access-hdpwv\") on node \"crc\" DevicePath \"\"" Mar 10 15:50:05 crc kubenswrapper[4743]: I0310 15:50:05.424747 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-bldfb" event={"ID":"8c505d23-bed2-4d65-864f-47bda8527537","Type":"ContainerDied","Data":"ac321f8ae63315a05c4059d7b6996617441fdd338b89c4b5457c0195fa5fbd64"} Mar 10 15:50:05 crc kubenswrapper[4743]: I0310 15:50:05.424789 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac321f8ae63315a05c4059d7b6996617441fdd338b89c4b5457c0195fa5fbd64" Mar 10 15:50:05 crc kubenswrapper[4743]: I0310 15:50:05.424864 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-bldfb" Mar 10 15:50:05 crc kubenswrapper[4743]: I0310 15:50:05.887505 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552624-6cthv"] Mar 10 15:50:05 crc kubenswrapper[4743]: I0310 15:50:05.897805 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552624-6cthv"] Mar 10 15:50:05 crc kubenswrapper[4743]: I0310 15:50:05.946341 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f" path="/var/lib/kubelet/pods/d75c8a77-45d5-46ca-8d3d-7f1bc0eee23f/volumes" Mar 10 15:50:56 crc kubenswrapper[4743]: I0310 15:50:56.291605 4743 scope.go:117] "RemoveContainer" containerID="b701b41e53de5bf1b014e40c38f31c42770b896c100b6d7564f7d61e9c1cae1b" Mar 10 15:51:41 crc kubenswrapper[4743]: I0310 15:51:41.253208 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:51:41 crc kubenswrapper[4743]: I0310 15:51:41.254232 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:51:41 crc kubenswrapper[4743]: I0310 15:51:41.285641 4743 generic.go:334] "Generic (PLEG): container finished" podID="afce2ed9-7b72-4bee-a5f1-689f9f6888d8" containerID="24b24da54efcf740ad7efeb5b5ed5948d9f5a93e25057d6d055efe73c36d9cd6" exitCode=0 Mar 10 15:51:41 crc kubenswrapper[4743]: I0310 15:51:41.285693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" event={"ID":"afce2ed9-7b72-4bee-a5f1-689f9f6888d8","Type":"ContainerDied","Data":"24b24da54efcf740ad7efeb5b5ed5948d9f5a93e25057d6d055efe73c36d9cd6"} Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.794325 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.904461 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-telemetry-combined-ca-bundle\") pod \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.904555 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-1\") pod \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.904628 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ssh-key-openstack-edpm-ipam\") pod \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.904766 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-2\") pod \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " Mar 10 15:51:42 crc 
kubenswrapper[4743]: I0310 15:51:42.904853 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-inventory\") pod \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.904891 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p979b\" (UniqueName: \"kubernetes.io/projected/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-kube-api-access-p979b\") pod \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.905085 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-0\") pod \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\" (UID: \"afce2ed9-7b72-4bee-a5f1-689f9f6888d8\") " Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.911192 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-kube-api-access-p979b" (OuterVolumeSpecName: "kube-api-access-p979b") pod "afce2ed9-7b72-4bee-a5f1-689f9f6888d8" (UID: "afce2ed9-7b72-4bee-a5f1-689f9f6888d8"). InnerVolumeSpecName "kube-api-access-p979b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.914682 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "afce2ed9-7b72-4bee-a5f1-689f9f6888d8" (UID: "afce2ed9-7b72-4bee-a5f1-689f9f6888d8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.941385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "afce2ed9-7b72-4bee-a5f1-689f9f6888d8" (UID: "afce2ed9-7b72-4bee-a5f1-689f9f6888d8"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.943551 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "afce2ed9-7b72-4bee-a5f1-689f9f6888d8" (UID: "afce2ed9-7b72-4bee-a5f1-689f9f6888d8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.944251 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "afce2ed9-7b72-4bee-a5f1-689f9f6888d8" (UID: "afce2ed9-7b72-4bee-a5f1-689f9f6888d8"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.944621 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-inventory" (OuterVolumeSpecName: "inventory") pod "afce2ed9-7b72-4bee-a5f1-689f9f6888d8" (UID: "afce2ed9-7b72-4bee-a5f1-689f9f6888d8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:51:42 crc kubenswrapper[4743]: I0310 15:51:42.966695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "afce2ed9-7b72-4bee-a5f1-689f9f6888d8" (UID: "afce2ed9-7b72-4bee-a5f1-689f9f6888d8"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.007907 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.007964 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.007980 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.007993 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.008008 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-ceilometer-compute-config-data-2\") on 
node \"crc\" DevicePath \"\"" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.008023 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.008040 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p979b\" (UniqueName: \"kubernetes.io/projected/afce2ed9-7b72-4bee-a5f1-689f9f6888d8-kube-api-access-p979b\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.318887 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" event={"ID":"afce2ed9-7b72-4bee-a5f1-689f9f6888d8","Type":"ContainerDied","Data":"76a5ad9a98a5917fbd6d1a838bfd725dd28ec9000760ea307668adbfb4ea5c92"} Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.318932 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76a5ad9a98a5917fbd6d1a838bfd725dd28ec9000760ea307668adbfb4ea5c92" Mar 10 15:51:43 crc kubenswrapper[4743]: I0310 15:51:43.319010 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-swbc5" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.163863 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552632-sz2wx"] Mar 10 15:52:00 crc kubenswrapper[4743]: E0310 15:52:00.165953 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c505d23-bed2-4d65-864f-47bda8527537" containerName="oc" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.165979 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c505d23-bed2-4d65-864f-47bda8527537" containerName="oc" Mar 10 15:52:00 crc kubenswrapper[4743]: E0310 15:52:00.166043 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afce2ed9-7b72-4bee-a5f1-689f9f6888d8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.166054 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="afce2ed9-7b72-4bee-a5f1-689f9f6888d8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.166361 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="afce2ed9-7b72-4bee-a5f1-689f9f6888d8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.166390 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c505d23-bed2-4d65-864f-47bda8527537" containerName="oc" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.167377 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.170513 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.170730 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.172305 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.174211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-sz2wx"] Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.262876 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xx4j\" (UniqueName: \"kubernetes.io/projected/18725857-3c76-4bc6-8dd5-acd071f2d26a-kube-api-access-6xx4j\") pod \"auto-csr-approver-29552632-sz2wx\" (UID: \"18725857-3c76-4bc6-8dd5-acd071f2d26a\") " pod="openshift-infra/auto-csr-approver-29552632-sz2wx" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.364891 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xx4j\" (UniqueName: \"kubernetes.io/projected/18725857-3c76-4bc6-8dd5-acd071f2d26a-kube-api-access-6xx4j\") pod \"auto-csr-approver-29552632-sz2wx\" (UID: \"18725857-3c76-4bc6-8dd5-acd071f2d26a\") " pod="openshift-infra/auto-csr-approver-29552632-sz2wx" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.407774 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xx4j\" (UniqueName: \"kubernetes.io/projected/18725857-3c76-4bc6-8dd5-acd071f2d26a-kube-api-access-6xx4j\") pod \"auto-csr-approver-29552632-sz2wx\" (UID: \"18725857-3c76-4bc6-8dd5-acd071f2d26a\") " 
pod="openshift-infra/auto-csr-approver-29552632-sz2wx" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.491369 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" Mar 10 15:52:00 crc kubenswrapper[4743]: I0310 15:52:00.992627 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-sz2wx"] Mar 10 15:52:01 crc kubenswrapper[4743]: I0310 15:52:01.483899 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" event={"ID":"18725857-3c76-4bc6-8dd5-acd071f2d26a","Type":"ContainerStarted","Data":"3020d580dad836ea6e0e1cd968c3107a966c1d2f973f3f113d09fa9dd08e5d8b"} Mar 10 15:52:02 crc kubenswrapper[4743]: I0310 15:52:02.493664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" event={"ID":"18725857-3c76-4bc6-8dd5-acd071f2d26a","Type":"ContainerStarted","Data":"4458ada4c24becd9201554722d5265b1ff6689ed5fcba3d124da9e841a3f9a50"} Mar 10 15:52:02 crc kubenswrapper[4743]: I0310 15:52:02.518444 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" podStartSLOduration=1.532825042 podStartE2EDuration="2.518424474s" podCreationTimestamp="2026-03-10 15:52:00 +0000 UTC" firstStartedPulling="2026-03-10 15:52:00.98715017 +0000 UTC m=+2785.693964918" lastFinishedPulling="2026-03-10 15:52:01.972749602 +0000 UTC m=+2786.679564350" observedRunningTime="2026-03-10 15:52:02.510638339 +0000 UTC m=+2787.217453087" watchObservedRunningTime="2026-03-10 15:52:02.518424474 +0000 UTC m=+2787.225239222" Mar 10 15:52:03 crc kubenswrapper[4743]: I0310 15:52:03.505058 4743 generic.go:334] "Generic (PLEG): container finished" podID="18725857-3c76-4bc6-8dd5-acd071f2d26a" containerID="4458ada4c24becd9201554722d5265b1ff6689ed5fcba3d124da9e841a3f9a50" exitCode=0 Mar 10 15:52:03 crc 
kubenswrapper[4743]: I0310 15:52:03.505156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" event={"ID":"18725857-3c76-4bc6-8dd5-acd071f2d26a","Type":"ContainerDied","Data":"4458ada4c24becd9201554722d5265b1ff6689ed5fcba3d124da9e841a3f9a50"} Mar 10 15:52:04 crc kubenswrapper[4743]: I0310 15:52:04.921180 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.066098 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xx4j\" (UniqueName: \"kubernetes.io/projected/18725857-3c76-4bc6-8dd5-acd071f2d26a-kube-api-access-6xx4j\") pod \"18725857-3c76-4bc6-8dd5-acd071f2d26a\" (UID: \"18725857-3c76-4bc6-8dd5-acd071f2d26a\") " Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.074534 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18725857-3c76-4bc6-8dd5-acd071f2d26a-kube-api-access-6xx4j" (OuterVolumeSpecName: "kube-api-access-6xx4j") pod "18725857-3c76-4bc6-8dd5-acd071f2d26a" (UID: "18725857-3c76-4bc6-8dd5-acd071f2d26a"). InnerVolumeSpecName "kube-api-access-6xx4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.168367 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xx4j\" (UniqueName: \"kubernetes.io/projected/18725857-3c76-4bc6-8dd5-acd071f2d26a-kube-api-access-6xx4j\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.523113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" event={"ID":"18725857-3c76-4bc6-8dd5-acd071f2d26a","Type":"ContainerDied","Data":"3020d580dad836ea6e0e1cd968c3107a966c1d2f973f3f113d09fa9dd08e5d8b"} Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.523157 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3020d580dad836ea6e0e1cd968c3107a966c1d2f973f3f113d09fa9dd08e5d8b" Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.523223 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-sz2wx" Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.605168 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552626-tvwx2"] Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.613918 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552626-tvwx2"] Mar 10 15:52:05 crc kubenswrapper[4743]: I0310 15:52:05.927379 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04b5b7c-4428-4790-9c3b-ce2f467f1afa" path="/var/lib/kubelet/pods/d04b5b7c-4428-4790-9c3b-ce2f467f1afa/volumes" Mar 10 15:52:11 crc kubenswrapper[4743]: I0310 15:52:11.252411 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 15:52:11 crc kubenswrapper[4743]: I0310 15:52:11.253016 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.252326 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.252977 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.253034 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.253948 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e7a3799d3537217be6b939fa0048b4e6f8b92c185d54b11efba8d466671befc"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.254004 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://9e7a3799d3537217be6b939fa0048b4e6f8b92c185d54b11efba8d466671befc" gracePeriod=600 Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.891253 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="9e7a3799d3537217be6b939fa0048b4e6f8b92c185d54b11efba8d466671befc" exitCode=0 Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.891318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"9e7a3799d3537217be6b939fa0048b4e6f8b92c185d54b11efba8d466671befc"} Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.891966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e"} Mar 10 15:52:41 crc kubenswrapper[4743]: I0310 15:52:41.891994 4743 scope.go:117] "RemoveContainer" containerID="707b83b41fad29c8efb0a7fa26580830bd1b87228a74775ef9e029767ae7788e" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.309935 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 15:52:44 crc kubenswrapper[4743]: E0310 15:52:44.310885 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18725857-3c76-4bc6-8dd5-acd071f2d26a" containerName="oc" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.310901 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="18725857-3c76-4bc6-8dd5-acd071f2d26a" containerName="oc" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.311129 4743 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="18725857-3c76-4bc6-8dd5-acd071f2d26a" containerName="oc" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.311799 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.313915 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.313941 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.314479 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.322736 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.430989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.431040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbxg2\" (UniqueName: \"kubernetes.io/projected/fa680413-f368-421d-914c-1941e02c2c57-kube-api-access-jbxg2\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.431234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.431372 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.431674 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.431732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-config-data\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.431783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.431885 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.431951 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.534113 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.534162 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbxg2\" (UniqueName: \"kubernetes.io/projected/fa680413-f368-421d-914c-1941e02c2c57-kube-api-access-jbxg2\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.534236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.534287 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.534842 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.534972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-config-data\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.535069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.535289 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.536187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.535376 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.536350 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.536451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.536530 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.537634 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.541235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.548255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.550628 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.561843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbxg2\" (UniqueName: \"kubernetes.io/projected/fa680413-f368-421d-914c-1941e02c2c57-kube-api-access-jbxg2\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.579703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " pod="openstack/tempest-tests-tempest" Mar 10 15:52:44 crc kubenswrapper[4743]: I0310 15:52:44.634353 4743 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 15:52:45 crc kubenswrapper[4743]: I0310 15:52:45.095201 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 15:52:45 crc kubenswrapper[4743]: W0310 15:52:45.103651 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa680413_f368_421d_914c_1941e02c2c57.slice/crio-324c5203973df81b8a32781238e5d83fa4f17ccb03d1a5b91707bd19fe5a04f2 WatchSource:0}: Error finding container 324c5203973df81b8a32781238e5d83fa4f17ccb03d1a5b91707bd19fe5a04f2: Status 404 returned error can't find the container with id 324c5203973df81b8a32781238e5d83fa4f17ccb03d1a5b91707bd19fe5a04f2 Mar 10 15:52:45 crc kubenswrapper[4743]: I0310 15:52:45.951987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fa680413-f368-421d-914c-1941e02c2c57","Type":"ContainerStarted","Data":"324c5203973df81b8a32781238e5d83fa4f17ccb03d1a5b91707bd19fe5a04f2"} Mar 10 15:52:56 crc kubenswrapper[4743]: I0310 15:52:56.405880 4743 scope.go:117] "RemoveContainer" containerID="99af52d1d94d1d75d6744e3b4888d3df15e44a8c84df83577a972f3cbc720fe5" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.430499 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hh9pz"] Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.441068 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.453510 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hh9pz"] Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.493048 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-catalog-content\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.493475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-utilities\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.493735 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdv2\" (UniqueName: \"kubernetes.io/projected/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-kube-api-access-ccdv2\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.595555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-utilities\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.595654 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ccdv2\" (UniqueName: \"kubernetes.io/projected/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-kube-api-access-ccdv2\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.595747 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-catalog-content\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.596320 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-catalog-content\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.596555 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-utilities\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.618333 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccdv2\" (UniqueName: \"kubernetes.io/projected/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-kube-api-access-ccdv2\") pod \"redhat-operators-hh9pz\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:12 crc kubenswrapper[4743]: I0310 15:53:12.776163 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:14 crc kubenswrapper[4743]: E0310 15:53:14.709325 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 10 15:53:14 crc kubenswrapper[4743]: E0310 15:53:14.709861 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/e
xtracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbxg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(fa680413-f368-421d-914c-1941e02c2c57): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:53:14 crc kubenswrapper[4743]: E0310 15:53:14.711039 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="fa680413-f368-421d-914c-1941e02c2c57" Mar 10 15:53:15 crc kubenswrapper[4743]: I0310 15:53:15.143614 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hh9pz"] Mar 10 15:53:15 crc kubenswrapper[4743]: I0310 15:53:15.253151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh9pz" event={"ID":"367d82ba-6b9d-4562-8be9-0d68ec81ebf8","Type":"ContainerStarted","Data":"d685cd436adb9def05b0a88cdc3e0a5ad421c493f61c4068fbc8bfe0cabb5824"} Mar 10 15:53:15 crc kubenswrapper[4743]: E0310 15:53:15.255044 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="fa680413-f368-421d-914c-1941e02c2c57" Mar 10 15:53:16 crc kubenswrapper[4743]: I0310 15:53:16.270479 4743 generic.go:334] "Generic (PLEG): container finished" podID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerID="7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08" exitCode=0 Mar 10 15:53:16 crc kubenswrapper[4743]: I0310 15:53:16.270586 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh9pz" event={"ID":"367d82ba-6b9d-4562-8be9-0d68ec81ebf8","Type":"ContainerDied","Data":"7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08"} Mar 10 15:53:16 crc kubenswrapper[4743]: I0310 15:53:16.275427 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:53:18 crc kubenswrapper[4743]: I0310 15:53:18.289877 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh9pz" 
event={"ID":"367d82ba-6b9d-4562-8be9-0d68ec81ebf8","Type":"ContainerStarted","Data":"ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18"} Mar 10 15:53:25 crc kubenswrapper[4743]: I0310 15:53:25.357149 4743 generic.go:334] "Generic (PLEG): container finished" podID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerID="ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18" exitCode=0 Mar 10 15:53:25 crc kubenswrapper[4743]: I0310 15:53:25.357241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh9pz" event={"ID":"367d82ba-6b9d-4562-8be9-0d68ec81ebf8","Type":"ContainerDied","Data":"ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18"} Mar 10 15:53:26 crc kubenswrapper[4743]: I0310 15:53:26.369229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh9pz" event={"ID":"367d82ba-6b9d-4562-8be9-0d68ec81ebf8","Type":"ContainerStarted","Data":"2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0"} Mar 10 15:53:26 crc kubenswrapper[4743]: I0310 15:53:26.389098 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hh9pz" podStartSLOduration=4.805803207 podStartE2EDuration="14.389076367s" podCreationTimestamp="2026-03-10 15:53:12 +0000 UTC" firstStartedPulling="2026-03-10 15:53:16.275034205 +0000 UTC m=+2860.981848983" lastFinishedPulling="2026-03-10 15:53:25.858307395 +0000 UTC m=+2870.565122143" observedRunningTime="2026-03-10 15:53:26.388292695 +0000 UTC m=+2871.095107463" watchObservedRunningTime="2026-03-10 15:53:26.389076367 +0000 UTC m=+2871.095891115" Mar 10 15:53:30 crc kubenswrapper[4743]: I0310 15:53:30.422735 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fa680413-f368-421d-914c-1941e02c2c57","Type":"ContainerStarted","Data":"44dcc03ade38ccd23aaa0bde7d1b97d53489b3aea5d6eadc5f26a931bf4e0a97"} Mar 10 
15:53:30 crc kubenswrapper[4743]: I0310 15:53:30.444095 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.077771285 podStartE2EDuration="47.444072574s" podCreationTimestamp="2026-03-10 15:52:43 +0000 UTC" firstStartedPulling="2026-03-10 15:52:45.108700029 +0000 UTC m=+2829.815514777" lastFinishedPulling="2026-03-10 15:53:29.475001308 +0000 UTC m=+2874.181816066" observedRunningTime="2026-03-10 15:53:30.443052205 +0000 UTC m=+2875.149866953" watchObservedRunningTime="2026-03-10 15:53:30.444072574 +0000 UTC m=+2875.150887332" Mar 10 15:53:32 crc kubenswrapper[4743]: I0310 15:53:32.777156 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:32 crc kubenswrapper[4743]: I0310 15:53:32.777832 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:32 crc kubenswrapper[4743]: I0310 15:53:32.844745 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:33 crc kubenswrapper[4743]: I0310 15:53:33.496970 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:33 crc kubenswrapper[4743]: I0310 15:53:33.550155 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hh9pz"] Mar 10 15:53:35 crc kubenswrapper[4743]: I0310 15:53:35.468731 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hh9pz" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerName="registry-server" containerID="cri-o://2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0" gracePeriod=2 Mar 10 15:53:35 crc kubenswrapper[4743]: I0310 15:53:35.986660 4743 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.105392 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccdv2\" (UniqueName: \"kubernetes.io/projected/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-kube-api-access-ccdv2\") pod \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.105570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-utilities\") pod \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.105681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-catalog-content\") pod \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\" (UID: \"367d82ba-6b9d-4562-8be9-0d68ec81ebf8\") " Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.106536 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-utilities" (OuterVolumeSpecName: "utilities") pod "367d82ba-6b9d-4562-8be9-0d68ec81ebf8" (UID: "367d82ba-6b9d-4562-8be9-0d68ec81ebf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.112131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-kube-api-access-ccdv2" (OuterVolumeSpecName: "kube-api-access-ccdv2") pod "367d82ba-6b9d-4562-8be9-0d68ec81ebf8" (UID: "367d82ba-6b9d-4562-8be9-0d68ec81ebf8"). InnerVolumeSpecName "kube-api-access-ccdv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.208539 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccdv2\" (UniqueName: \"kubernetes.io/projected/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-kube-api-access-ccdv2\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.208570 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.255588 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "367d82ba-6b9d-4562-8be9-0d68ec81ebf8" (UID: "367d82ba-6b9d-4562-8be9-0d68ec81ebf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.310010 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d82ba-6b9d-4562-8be9-0d68ec81ebf8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.505201 4743 generic.go:334] "Generic (PLEG): container finished" podID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerID="2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0" exitCode=0 Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.505360 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hh9pz" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.505427 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh9pz" event={"ID":"367d82ba-6b9d-4562-8be9-0d68ec81ebf8","Type":"ContainerDied","Data":"2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0"} Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.506892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh9pz" event={"ID":"367d82ba-6b9d-4562-8be9-0d68ec81ebf8","Type":"ContainerDied","Data":"d685cd436adb9def05b0a88cdc3e0a5ad421c493f61c4068fbc8bfe0cabb5824"} Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.506946 4743 scope.go:117] "RemoveContainer" containerID="2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.541155 4743 scope.go:117] "RemoveContainer" containerID="ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.551960 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hh9pz"] Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.560669 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hh9pz"] Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.566767 4743 scope.go:117] "RemoveContainer" containerID="7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.613314 4743 scope.go:117] "RemoveContainer" containerID="2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0" Mar 10 15:53:36 crc kubenswrapper[4743]: E0310 15:53:36.614230 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0\": container with ID starting with 2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0 not found: ID does not exist" containerID="2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.614284 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0"} err="failed to get container status \"2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0\": rpc error: code = NotFound desc = could not find container \"2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0\": container with ID starting with 2faf5aaa8b19b4c06c8d07be36c874bc4b3655eff94964ad4ef30f4d3ab63ef0 not found: ID does not exist" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.614318 4743 scope.go:117] "RemoveContainer" containerID="ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18" Mar 10 15:53:36 crc kubenswrapper[4743]: E0310 15:53:36.615117 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18\": container with ID starting with ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18 not found: ID does not exist" containerID="ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.615145 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18"} err="failed to get container status \"ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18\": rpc error: code = NotFound desc = could not find container \"ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18\": container with ID 
starting with ec019f57aac1ac70023c89948855206908f785cf90d70b11b046181d616b0a18 not found: ID does not exist" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.615162 4743 scope.go:117] "RemoveContainer" containerID="7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08" Mar 10 15:53:36 crc kubenswrapper[4743]: E0310 15:53:36.616757 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08\": container with ID starting with 7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08 not found: ID does not exist" containerID="7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08" Mar 10 15:53:36 crc kubenswrapper[4743]: I0310 15:53:36.616906 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08"} err="failed to get container status \"7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08\": rpc error: code = NotFound desc = could not find container \"7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08\": container with ID starting with 7e6b233c409b1b6f4dedb671505af5dde81ad9d63c3bd3b2e2af8ff997b33e08 not found: ID does not exist" Mar 10 15:53:37 crc kubenswrapper[4743]: I0310 15:53:37.936521 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" path="/var/lib/kubelet/pods/367d82ba-6b9d-4562-8be9-0d68ec81ebf8/volumes" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.785680 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5g4g4"] Mar 10 15:53:52 crc kubenswrapper[4743]: E0310 15:53:52.786967 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerName="extract-content" Mar 10 15:53:52 crc 
kubenswrapper[4743]: I0310 15:53:52.786984 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerName="extract-content" Mar 10 15:53:52 crc kubenswrapper[4743]: E0310 15:53:52.786996 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerName="extract-utilities" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.787004 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerName="extract-utilities" Mar 10 15:53:52 crc kubenswrapper[4743]: E0310 15:53:52.787030 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerName="registry-server" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.787037 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerName="registry-server" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.787211 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="367d82ba-6b9d-4562-8be9-0d68ec81ebf8" containerName="registry-server" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.788803 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.803373 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5g4g4"] Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.808713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-catalog-content\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.809026 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmk9x\" (UniqueName: \"kubernetes.io/projected/313057e3-6498-45e3-a93f-bcba978656f6-kube-api-access-fmk9x\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.809150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-utilities\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.911786 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-catalog-content\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.912054 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fmk9x\" (UniqueName: \"kubernetes.io/projected/313057e3-6498-45e3-a93f-bcba978656f6-kube-api-access-fmk9x\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.912085 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-utilities\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.912292 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-catalog-content\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.912648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-utilities\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:52 crc kubenswrapper[4743]: I0310 15:53:52.940074 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmk9x\" (UniqueName: \"kubernetes.io/projected/313057e3-6498-45e3-a93f-bcba978656f6-kube-api-access-fmk9x\") pod \"community-operators-5g4g4\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:53 crc kubenswrapper[4743]: I0310 15:53:53.167185 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:53:53 crc kubenswrapper[4743]: I0310 15:53:53.766596 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5g4g4"] Mar 10 15:53:54 crc kubenswrapper[4743]: I0310 15:53:54.684039 4743 generic.go:334] "Generic (PLEG): container finished" podID="313057e3-6498-45e3-a93f-bcba978656f6" containerID="bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b" exitCode=0 Mar 10 15:53:54 crc kubenswrapper[4743]: I0310 15:53:54.684308 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g4g4" event={"ID":"313057e3-6498-45e3-a93f-bcba978656f6","Type":"ContainerDied","Data":"bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b"} Mar 10 15:53:54 crc kubenswrapper[4743]: I0310 15:53:54.684336 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g4g4" event={"ID":"313057e3-6498-45e3-a93f-bcba978656f6","Type":"ContainerStarted","Data":"b3cda92f617897ab3b3e1e3142b200b05e86c47ca070b60b6db9c4a88c129331"} Mar 10 15:53:55 crc kubenswrapper[4743]: I0310 15:53:55.704046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g4g4" event={"ID":"313057e3-6498-45e3-a93f-bcba978656f6","Type":"ContainerStarted","Data":"17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68"} Mar 10 15:53:57 crc kubenswrapper[4743]: I0310 15:53:57.727015 4743 generic.go:334] "Generic (PLEG): container finished" podID="313057e3-6498-45e3-a93f-bcba978656f6" containerID="17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68" exitCode=0 Mar 10 15:53:57 crc kubenswrapper[4743]: I0310 15:53:57.727079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g4g4" 
event={"ID":"313057e3-6498-45e3-a93f-bcba978656f6","Type":"ContainerDied","Data":"17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68"} Mar 10 15:53:58 crc kubenswrapper[4743]: I0310 15:53:58.744323 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g4g4" event={"ID":"313057e3-6498-45e3-a93f-bcba978656f6","Type":"ContainerStarted","Data":"b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c"} Mar 10 15:53:58 crc kubenswrapper[4743]: I0310 15:53:58.767406 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5g4g4" podStartSLOduration=3.226542681 podStartE2EDuration="6.767387017s" podCreationTimestamp="2026-03-10 15:53:52 +0000 UTC" firstStartedPulling="2026-03-10 15:53:54.686557135 +0000 UTC m=+2899.393371883" lastFinishedPulling="2026-03-10 15:53:58.227401461 +0000 UTC m=+2902.934216219" observedRunningTime="2026-03-10 15:53:58.767225242 +0000 UTC m=+2903.474040010" watchObservedRunningTime="2026-03-10 15:53:58.767387017 +0000 UTC m=+2903.474201765" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.144433 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552634-nvb5j"] Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.146284 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-nvb5j" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.148858 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.149088 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.149317 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.163121 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-nvb5j"] Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.280652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tg7q\" (UniqueName: \"kubernetes.io/projected/fa770b16-0f11-44ce-9565-67292e1e62ca-kube-api-access-7tg7q\") pod \"auto-csr-approver-29552634-nvb5j\" (UID: \"fa770b16-0f11-44ce-9565-67292e1e62ca\") " pod="openshift-infra/auto-csr-approver-29552634-nvb5j" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.384267 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tg7q\" (UniqueName: \"kubernetes.io/projected/fa770b16-0f11-44ce-9565-67292e1e62ca-kube-api-access-7tg7q\") pod \"auto-csr-approver-29552634-nvb5j\" (UID: \"fa770b16-0f11-44ce-9565-67292e1e62ca\") " pod="openshift-infra/auto-csr-approver-29552634-nvb5j" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.417396 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tg7q\" (UniqueName: \"kubernetes.io/projected/fa770b16-0f11-44ce-9565-67292e1e62ca-kube-api-access-7tg7q\") pod \"auto-csr-approver-29552634-nvb5j\" (UID: \"fa770b16-0f11-44ce-9565-67292e1e62ca\") " 
pod="openshift-infra/auto-csr-approver-29552634-nvb5j" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.473141 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-nvb5j" Mar 10 15:54:00 crc kubenswrapper[4743]: I0310 15:54:00.950854 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-nvb5j"] Mar 10 15:54:00 crc kubenswrapper[4743]: W0310 15:54:00.955201 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa770b16_0f11_44ce_9565_67292e1e62ca.slice/crio-c38dcc718cdf000c2e3a5a9e01e33e19c3b77a855cc646cbe9d3ef7bf65f68d3 WatchSource:0}: Error finding container c38dcc718cdf000c2e3a5a9e01e33e19c3b77a855cc646cbe9d3ef7bf65f68d3: Status 404 returned error can't find the container with id c38dcc718cdf000c2e3a5a9e01e33e19c3b77a855cc646cbe9d3ef7bf65f68d3 Mar 10 15:54:01 crc kubenswrapper[4743]: I0310 15:54:01.788348 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-nvb5j" event={"ID":"fa770b16-0f11-44ce-9565-67292e1e62ca","Type":"ContainerStarted","Data":"c38dcc718cdf000c2e3a5a9e01e33e19c3b77a855cc646cbe9d3ef7bf65f68d3"} Mar 10 15:54:02 crc kubenswrapper[4743]: I0310 15:54:02.801953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-nvb5j" event={"ID":"fa770b16-0f11-44ce-9565-67292e1e62ca","Type":"ContainerDied","Data":"526b2371123323292f8acb871aded42430652b5d046b75edb945c7aaeacd976b"} Mar 10 15:54:02 crc kubenswrapper[4743]: I0310 15:54:02.801960 4743 generic.go:334] "Generic (PLEG): container finished" podID="fa770b16-0f11-44ce-9565-67292e1e62ca" containerID="526b2371123323292f8acb871aded42430652b5d046b75edb945c7aaeacd976b" exitCode=0 Mar 10 15:54:03 crc kubenswrapper[4743]: I0310 15:54:03.168264 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:54:03 crc kubenswrapper[4743]: I0310 15:54:03.168372 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:54:04 crc kubenswrapper[4743]: I0310 15:54:04.215144 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-nvb5j" Mar 10 15:54:04 crc kubenswrapper[4743]: I0310 15:54:04.219119 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5g4g4" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="registry-server" probeResult="failure" output=< Mar 10 15:54:04 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 15:54:04 crc kubenswrapper[4743]: > Mar 10 15:54:04 crc kubenswrapper[4743]: I0310 15:54:04.372637 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tg7q\" (UniqueName: \"kubernetes.io/projected/fa770b16-0f11-44ce-9565-67292e1e62ca-kube-api-access-7tg7q\") pod \"fa770b16-0f11-44ce-9565-67292e1e62ca\" (UID: \"fa770b16-0f11-44ce-9565-67292e1e62ca\") " Mar 10 15:54:04 crc kubenswrapper[4743]: I0310 15:54:04.387621 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa770b16-0f11-44ce-9565-67292e1e62ca-kube-api-access-7tg7q" (OuterVolumeSpecName: "kube-api-access-7tg7q") pod "fa770b16-0f11-44ce-9565-67292e1e62ca" (UID: "fa770b16-0f11-44ce-9565-67292e1e62ca"). InnerVolumeSpecName "kube-api-access-7tg7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:54:04 crc kubenswrapper[4743]: I0310 15:54:04.476092 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tg7q\" (UniqueName: \"kubernetes.io/projected/fa770b16-0f11-44ce-9565-67292e1e62ca-kube-api-access-7tg7q\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:04 crc kubenswrapper[4743]: I0310 15:54:04.822452 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-nvb5j" event={"ID":"fa770b16-0f11-44ce-9565-67292e1e62ca","Type":"ContainerDied","Data":"c38dcc718cdf000c2e3a5a9e01e33e19c3b77a855cc646cbe9d3ef7bf65f68d3"} Mar 10 15:54:04 crc kubenswrapper[4743]: I0310 15:54:04.822495 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c38dcc718cdf000c2e3a5a9e01e33e19c3b77a855cc646cbe9d3ef7bf65f68d3" Mar 10 15:54:04 crc kubenswrapper[4743]: I0310 15:54:04.822550 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-nvb5j" Mar 10 15:54:05 crc kubenswrapper[4743]: I0310 15:54:05.286345 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552628-bddh8"] Mar 10 15:54:05 crc kubenswrapper[4743]: I0310 15:54:05.299200 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552628-bddh8"] Mar 10 15:54:05 crc kubenswrapper[4743]: I0310 15:54:05.928622 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af1be6e-8920-4180-94c1-c4c512a25cc4" path="/var/lib/kubelet/pods/1af1be6e-8920-4180-94c1-c4c512a25cc4/volumes" Mar 10 15:54:13 crc kubenswrapper[4743]: I0310 15:54:13.218790 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:54:13 crc kubenswrapper[4743]: I0310 15:54:13.284967 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:54:13 crc kubenswrapper[4743]: I0310 15:54:13.456526 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5g4g4"] Mar 10 15:54:14 crc kubenswrapper[4743]: I0310 15:54:14.778001 4743 scope.go:117] "RemoveContainer" containerID="eb181bf94b5a86035237eb06ad80ccba725e86e81bb36538cc70114154596345" Mar 10 15:54:14 crc kubenswrapper[4743]: I0310 15:54:14.941862 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5g4g4" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="registry-server" containerID="cri-o://b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c" gracePeriod=2 Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.520280 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.601675 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmk9x\" (UniqueName: \"kubernetes.io/projected/313057e3-6498-45e3-a93f-bcba978656f6-kube-api-access-fmk9x\") pod \"313057e3-6498-45e3-a93f-bcba978656f6\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.601855 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-catalog-content\") pod \"313057e3-6498-45e3-a93f-bcba978656f6\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.601886 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-utilities\") pod 
\"313057e3-6498-45e3-a93f-bcba978656f6\" (UID: \"313057e3-6498-45e3-a93f-bcba978656f6\") " Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.603093 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-utilities" (OuterVolumeSpecName: "utilities") pod "313057e3-6498-45e3-a93f-bcba978656f6" (UID: "313057e3-6498-45e3-a93f-bcba978656f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.609629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313057e3-6498-45e3-a93f-bcba978656f6-kube-api-access-fmk9x" (OuterVolumeSpecName: "kube-api-access-fmk9x") pod "313057e3-6498-45e3-a93f-bcba978656f6" (UID: "313057e3-6498-45e3-a93f-bcba978656f6"). InnerVolumeSpecName "kube-api-access-fmk9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.659097 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "313057e3-6498-45e3-a93f-bcba978656f6" (UID: "313057e3-6498-45e3-a93f-bcba978656f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.704795 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.704852 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmk9x\" (UniqueName: \"kubernetes.io/projected/313057e3-6498-45e3-a93f-bcba978656f6-kube-api-access-fmk9x\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.704863 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313057e3-6498-45e3-a93f-bcba978656f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.952320 4743 generic.go:334] "Generic (PLEG): container finished" podID="313057e3-6498-45e3-a93f-bcba978656f6" containerID="b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c" exitCode=0 Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.952382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g4g4" event={"ID":"313057e3-6498-45e3-a93f-bcba978656f6","Type":"ContainerDied","Data":"b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c"} Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.952416 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g4g4" event={"ID":"313057e3-6498-45e3-a93f-bcba978656f6","Type":"ContainerDied","Data":"b3cda92f617897ab3b3e1e3142b200b05e86c47ca070b60b6db9c4a88c129331"} Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.952438 4743 scope.go:117] "RemoveContainer" containerID="b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 
15:54:15.952459 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5g4g4" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.994137 4743 scope.go:117] "RemoveContainer" containerID="17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68" Mar 10 15:54:15 crc kubenswrapper[4743]: I0310 15:54:15.995677 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5g4g4"] Mar 10 15:54:16 crc kubenswrapper[4743]: I0310 15:54:16.005273 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5g4g4"] Mar 10 15:54:16 crc kubenswrapper[4743]: I0310 15:54:16.058310 4743 scope.go:117] "RemoveContainer" containerID="bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b" Mar 10 15:54:16 crc kubenswrapper[4743]: I0310 15:54:16.084021 4743 scope.go:117] "RemoveContainer" containerID="b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c" Mar 10 15:54:16 crc kubenswrapper[4743]: E0310 15:54:16.084650 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c\": container with ID starting with b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c not found: ID does not exist" containerID="b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c" Mar 10 15:54:16 crc kubenswrapper[4743]: I0310 15:54:16.084716 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c"} err="failed to get container status \"b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c\": rpc error: code = NotFound desc = could not find container \"b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c\": container with ID starting with 
b5eef72c5ca833d363a2ee465bd8306764623761f7310090d599aeffd6160d3c not found: ID does not exist" Mar 10 15:54:16 crc kubenswrapper[4743]: I0310 15:54:16.084737 4743 scope.go:117] "RemoveContainer" containerID="17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68" Mar 10 15:54:16 crc kubenswrapper[4743]: E0310 15:54:16.085306 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68\": container with ID starting with 17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68 not found: ID does not exist" containerID="17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68" Mar 10 15:54:16 crc kubenswrapper[4743]: I0310 15:54:16.085351 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68"} err="failed to get container status \"17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68\": rpc error: code = NotFound desc = could not find container \"17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68\": container with ID starting with 17be5e6c4251827f5a009c99bf6d6646b7f0055fabb15899315210ed6220ac68 not found: ID does not exist" Mar 10 15:54:16 crc kubenswrapper[4743]: I0310 15:54:16.085381 4743 scope.go:117] "RemoveContainer" containerID="bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b" Mar 10 15:54:16 crc kubenswrapper[4743]: E0310 15:54:16.085771 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b\": container with ID starting with bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b not found: ID does not exist" containerID="bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b" Mar 10 15:54:16 crc 
kubenswrapper[4743]: I0310 15:54:16.085795 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b"} err="failed to get container status \"bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b\": rpc error: code = NotFound desc = could not find container \"bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b\": container with ID starting with bc4e21ffcf715485ba1e318e2c0ae47d2f9b49975d53c34d68b888d14259640b not found: ID does not exist" Mar 10 15:54:17 crc kubenswrapper[4743]: I0310 15:54:17.927651 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313057e3-6498-45e3-a93f-bcba978656f6" path="/var/lib/kubelet/pods/313057e3-6498-45e3-a93f-bcba978656f6/volumes" Mar 10 15:54:41 crc kubenswrapper[4743]: I0310 15:54:41.253242 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:54:41 crc kubenswrapper[4743]: I0310 15:54:41.254201 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:55:11 crc kubenswrapper[4743]: I0310 15:55:11.252981 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:55:11 crc kubenswrapper[4743]: I0310 15:55:11.253427 4743 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.252316 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.252927 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.252981 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.253845 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.253911 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" 
containerName="machine-config-daemon" containerID="cri-o://0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" gracePeriod=600 Mar 10 15:55:41 crc kubenswrapper[4743]: E0310 15:55:41.385195 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.760019 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" exitCode=0 Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.760074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e"} Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.760114 4743 scope.go:117] "RemoveContainer" containerID="9e7a3799d3537217be6b939fa0048b4e6f8b92c185d54b11efba8d466671befc" Mar 10 15:55:41 crc kubenswrapper[4743]: I0310 15:55:41.760955 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:55:41 crc kubenswrapper[4743]: E0310 15:55:41.761359 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:55:52 crc kubenswrapper[4743]: I0310 15:55:52.915518 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:55:52 crc kubenswrapper[4743]: E0310 15:55:52.916207 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.148078 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552636-72t2c"] Mar 10 15:56:00 crc kubenswrapper[4743]: E0310 15:56:00.149167 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="extract-content" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.149184 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="extract-content" Mar 10 15:56:00 crc kubenswrapper[4743]: E0310 15:56:00.149212 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="registry-server" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.149220 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="registry-server" Mar 10 15:56:00 crc kubenswrapper[4743]: E0310 15:56:00.149236 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="extract-utilities" Mar 10 15:56:00 crc kubenswrapper[4743]: 
I0310 15:56:00.149246 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="extract-utilities" Mar 10 15:56:00 crc kubenswrapper[4743]: E0310 15:56:00.149275 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa770b16-0f11-44ce-9565-67292e1e62ca" containerName="oc" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.149283 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa770b16-0f11-44ce-9565-67292e1e62ca" containerName="oc" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.149526 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="313057e3-6498-45e3-a93f-bcba978656f6" containerName="registry-server" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.149559 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa770b16-0f11-44ce-9565-67292e1e62ca" containerName="oc" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.150388 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-72t2c" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.152855 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.152887 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.155832 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.158651 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-72t2c"] Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.321956 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c77vs\" (UniqueName: \"kubernetes.io/projected/def90d0b-f33b-4496-b98c-ebe7cfcf9e9f-kube-api-access-c77vs\") pod \"auto-csr-approver-29552636-72t2c\" (UID: \"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f\") " pod="openshift-infra/auto-csr-approver-29552636-72t2c" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.423744 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c77vs\" (UniqueName: \"kubernetes.io/projected/def90d0b-f33b-4496-b98c-ebe7cfcf9e9f-kube-api-access-c77vs\") pod \"auto-csr-approver-29552636-72t2c\" (UID: \"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f\") " pod="openshift-infra/auto-csr-approver-29552636-72t2c" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.442239 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c77vs\" (UniqueName: \"kubernetes.io/projected/def90d0b-f33b-4496-b98c-ebe7cfcf9e9f-kube-api-access-c77vs\") pod \"auto-csr-approver-29552636-72t2c\" (UID: \"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f\") " 
pod="openshift-infra/auto-csr-approver-29552636-72t2c" Mar 10 15:56:00 crc kubenswrapper[4743]: I0310 15:56:00.514200 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-72t2c" Mar 10 15:56:01 crc kubenswrapper[4743]: I0310 15:56:01.091985 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-72t2c"] Mar 10 15:56:01 crc kubenswrapper[4743]: I0310 15:56:01.560602 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-72t2c" event={"ID":"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f","Type":"ContainerStarted","Data":"a27c37de0a799062fd3a2fa57ae8fd05c0c5e161d53c5d835958c1a5f39bf664"} Mar 10 15:56:02 crc kubenswrapper[4743]: I0310 15:56:02.584302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-72t2c" event={"ID":"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f","Type":"ContainerStarted","Data":"20669d201fefe4e6d486ac5cefb27b65aa1b98b0081a5d9611138b653a64a856"} Mar 10 15:56:02 crc kubenswrapper[4743]: I0310 15:56:02.603684 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552636-72t2c" podStartSLOduration=1.457571509 podStartE2EDuration="2.60366558s" podCreationTimestamp="2026-03-10 15:56:00 +0000 UTC" firstStartedPulling="2026-03-10 15:56:01.097669103 +0000 UTC m=+3025.804483851" lastFinishedPulling="2026-03-10 15:56:02.243763174 +0000 UTC m=+3026.950577922" observedRunningTime="2026-03-10 15:56:02.602108276 +0000 UTC m=+3027.308923034" watchObservedRunningTime="2026-03-10 15:56:02.60366558 +0000 UTC m=+3027.310480318" Mar 10 15:56:03 crc kubenswrapper[4743]: I0310 15:56:03.594632 4743 generic.go:334] "Generic (PLEG): container finished" podID="def90d0b-f33b-4496-b98c-ebe7cfcf9e9f" containerID="20669d201fefe4e6d486ac5cefb27b65aa1b98b0081a5d9611138b653a64a856" exitCode=0 Mar 10 15:56:03 crc 
kubenswrapper[4743]: I0310 15:56:03.594851 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-72t2c" event={"ID":"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f","Type":"ContainerDied","Data":"20669d201fefe4e6d486ac5cefb27b65aa1b98b0081a5d9611138b653a64a856"} Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.167398 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-72t2c" Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.331303 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c77vs\" (UniqueName: \"kubernetes.io/projected/def90d0b-f33b-4496-b98c-ebe7cfcf9e9f-kube-api-access-c77vs\") pod \"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f\" (UID: \"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f\") " Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.337614 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def90d0b-f33b-4496-b98c-ebe7cfcf9e9f-kube-api-access-c77vs" (OuterVolumeSpecName: "kube-api-access-c77vs") pod "def90d0b-f33b-4496-b98c-ebe7cfcf9e9f" (UID: "def90d0b-f33b-4496-b98c-ebe7cfcf9e9f"). InnerVolumeSpecName "kube-api-access-c77vs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.433515 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c77vs\" (UniqueName: \"kubernetes.io/projected/def90d0b-f33b-4496-b98c-ebe7cfcf9e9f-kube-api-access-c77vs\") on node \"crc\" DevicePath \"\"" Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.613379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-72t2c" event={"ID":"def90d0b-f33b-4496-b98c-ebe7cfcf9e9f","Type":"ContainerDied","Data":"a27c37de0a799062fd3a2fa57ae8fd05c0c5e161d53c5d835958c1a5f39bf664"} Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.613429 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a27c37de0a799062fd3a2fa57ae8fd05c0c5e161d53c5d835958c1a5f39bf664" Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.613450 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-72t2c" Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.676271 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-bldfb"] Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.686755 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-bldfb"] Mar 10 15:56:05 crc kubenswrapper[4743]: I0310 15:56:05.928388 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c505d23-bed2-4d65-864f-47bda8527537" path="/var/lib/kubelet/pods/8c505d23-bed2-4d65-864f-47bda8527537/volumes" Mar 10 15:56:06 crc kubenswrapper[4743]: I0310 15:56:06.915862 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:56:06 crc kubenswrapper[4743]: E0310 15:56:06.916148 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:56:14 crc kubenswrapper[4743]: I0310 15:56:14.947801 4743 scope.go:117] "RemoveContainer" containerID="412b28d98ea25215432eaaeed307e4fdf4211734e47ee766123617d12cb9b83a" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.140998 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xn6rp"] Mar 10 15:56:18 crc kubenswrapper[4743]: E0310 15:56:18.141645 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def90d0b-f33b-4496-b98c-ebe7cfcf9e9f" containerName="oc" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.141662 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="def90d0b-f33b-4496-b98c-ebe7cfcf9e9f" containerName="oc" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.145073 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="def90d0b-f33b-4496-b98c-ebe7cfcf9e9f" containerName="oc" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.146499 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.157422 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn6rp"] Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.233553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-catalog-content\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.233900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-utilities\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.233955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svksc\" (UniqueName: \"kubernetes.io/projected/d07cd290-6217-42a6-a22c-eb68211e2db2-kube-api-access-svksc\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.335873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-catalog-content\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.336011 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-utilities\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.336055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svksc\" (UniqueName: \"kubernetes.io/projected/d07cd290-6217-42a6-a22c-eb68211e2db2-kube-api-access-svksc\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.336491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-utilities\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.336493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-catalog-content\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.354981 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svksc\" (UniqueName: \"kubernetes.io/projected/d07cd290-6217-42a6-a22c-eb68211e2db2-kube-api-access-svksc\") pod \"redhat-marketplace-xn6rp\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.479798 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:18 crc kubenswrapper[4743]: I0310 15:56:18.983668 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn6rp"] Mar 10 15:56:19 crc kubenswrapper[4743]: I0310 15:56:19.753629 4743 generic.go:334] "Generic (PLEG): container finished" podID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerID="0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49" exitCode=0 Mar 10 15:56:19 crc kubenswrapper[4743]: I0310 15:56:19.754060 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn6rp" event={"ID":"d07cd290-6217-42a6-a22c-eb68211e2db2","Type":"ContainerDied","Data":"0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49"} Mar 10 15:56:19 crc kubenswrapper[4743]: I0310 15:56:19.754093 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn6rp" event={"ID":"d07cd290-6217-42a6-a22c-eb68211e2db2","Type":"ContainerStarted","Data":"dd784a66c4aa140a079578326f5d200a90848f4f733b903ee079fbd14f561c5d"} Mar 10 15:56:19 crc kubenswrapper[4743]: I0310 15:56:19.915754 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:56:19 crc kubenswrapper[4743]: E0310 15:56:19.916128 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:56:20 crc kubenswrapper[4743]: I0310 15:56:20.765578 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn6rp" 
event={"ID":"d07cd290-6217-42a6-a22c-eb68211e2db2","Type":"ContainerStarted","Data":"5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac"} Mar 10 15:56:21 crc kubenswrapper[4743]: I0310 15:56:21.799276 4743 generic.go:334] "Generic (PLEG): container finished" podID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerID="5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac" exitCode=0 Mar 10 15:56:21 crc kubenswrapper[4743]: I0310 15:56:21.799449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn6rp" event={"ID":"d07cd290-6217-42a6-a22c-eb68211e2db2","Type":"ContainerDied","Data":"5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac"} Mar 10 15:56:22 crc kubenswrapper[4743]: I0310 15:56:22.810935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn6rp" event={"ID":"d07cd290-6217-42a6-a22c-eb68211e2db2","Type":"ContainerStarted","Data":"0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869"} Mar 10 15:56:22 crc kubenswrapper[4743]: I0310 15:56:22.838557 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xn6rp" podStartSLOduration=2.41051588 podStartE2EDuration="4.838535985s" podCreationTimestamp="2026-03-10 15:56:18 +0000 UTC" firstStartedPulling="2026-03-10 15:56:19.759047936 +0000 UTC m=+3044.465862684" lastFinishedPulling="2026-03-10 15:56:22.187068041 +0000 UTC m=+3046.893882789" observedRunningTime="2026-03-10 15:56:22.834872902 +0000 UTC m=+3047.541687670" watchObservedRunningTime="2026-03-10 15:56:22.838535985 +0000 UTC m=+3047.545350743" Mar 10 15:56:28 crc kubenswrapper[4743]: I0310 15:56:28.480764 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:28 crc kubenswrapper[4743]: I0310 15:56:28.481282 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:28 crc kubenswrapper[4743]: I0310 15:56:28.532741 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:28 crc kubenswrapper[4743]: I0310 15:56:28.923429 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:28 crc kubenswrapper[4743]: I0310 15:56:28.996361 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn6rp"] Mar 10 15:56:30 crc kubenswrapper[4743]: I0310 15:56:30.886346 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xn6rp" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerName="registry-server" containerID="cri-o://0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869" gracePeriod=2 Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.641649 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.753925 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svksc\" (UniqueName: \"kubernetes.io/projected/d07cd290-6217-42a6-a22c-eb68211e2db2-kube-api-access-svksc\") pod \"d07cd290-6217-42a6-a22c-eb68211e2db2\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.754347 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-catalog-content\") pod \"d07cd290-6217-42a6-a22c-eb68211e2db2\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.754385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-utilities\") pod \"d07cd290-6217-42a6-a22c-eb68211e2db2\" (UID: \"d07cd290-6217-42a6-a22c-eb68211e2db2\") " Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.755643 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-utilities" (OuterVolumeSpecName: "utilities") pod "d07cd290-6217-42a6-a22c-eb68211e2db2" (UID: "d07cd290-6217-42a6-a22c-eb68211e2db2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.767087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d07cd290-6217-42a6-a22c-eb68211e2db2-kube-api-access-svksc" (OuterVolumeSpecName: "kube-api-access-svksc") pod "d07cd290-6217-42a6-a22c-eb68211e2db2" (UID: "d07cd290-6217-42a6-a22c-eb68211e2db2"). InnerVolumeSpecName "kube-api-access-svksc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.786082 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d07cd290-6217-42a6-a22c-eb68211e2db2" (UID: "d07cd290-6217-42a6-a22c-eb68211e2db2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.856778 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svksc\" (UniqueName: \"kubernetes.io/projected/d07cd290-6217-42a6-a22c-eb68211e2db2-kube-api-access-svksc\") on node \"crc\" DevicePath \"\"" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.856850 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.856862 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07cd290-6217-42a6-a22c-eb68211e2db2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.896425 4743 generic.go:334] "Generic (PLEG): container finished" podID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerID="0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869" exitCode=0 Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.896484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn6rp" event={"ID":"d07cd290-6217-42a6-a22c-eb68211e2db2","Type":"ContainerDied","Data":"0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869"} Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.896512 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xn6rp" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.896523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn6rp" event={"ID":"d07cd290-6217-42a6-a22c-eb68211e2db2","Type":"ContainerDied","Data":"dd784a66c4aa140a079578326f5d200a90848f4f733b903ee079fbd14f561c5d"} Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.896547 4743 scope.go:117] "RemoveContainer" containerID="0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.928004 4743 scope.go:117] "RemoveContainer" containerID="5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac" Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.949371 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn6rp"] Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.962773 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn6rp"] Mar 10 15:56:31 crc kubenswrapper[4743]: I0310 15:56:31.965935 4743 scope.go:117] "RemoveContainer" containerID="0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49" Mar 10 15:56:32 crc kubenswrapper[4743]: I0310 15:56:32.014037 4743 scope.go:117] "RemoveContainer" containerID="0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869" Mar 10 15:56:32 crc kubenswrapper[4743]: E0310 15:56:32.020471 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869\": container with ID starting with 0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869 not found: ID does not exist" containerID="0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869" Mar 10 15:56:32 crc kubenswrapper[4743]: I0310 15:56:32.020540 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869"} err="failed to get container status \"0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869\": rpc error: code = NotFound desc = could not find container \"0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869\": container with ID starting with 0ac7752d3ea88f2f898025a91ecaa18bd66c6f71d12d2613d10dbfa54aecd869 not found: ID does not exist" Mar 10 15:56:32 crc kubenswrapper[4743]: I0310 15:56:32.020568 4743 scope.go:117] "RemoveContainer" containerID="5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac" Mar 10 15:56:32 crc kubenswrapper[4743]: E0310 15:56:32.021077 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac\": container with ID starting with 5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac not found: ID does not exist" containerID="5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac" Mar 10 15:56:32 crc kubenswrapper[4743]: I0310 15:56:32.021126 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac"} err="failed to get container status \"5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac\": rpc error: code = NotFound desc = could not find container \"5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac\": container with ID starting with 5c00a525d47c95bc96df2929abdf6edddeb2d518d2291c5e6463b97f13c593ac not found: ID does not exist" Mar 10 15:56:32 crc kubenswrapper[4743]: I0310 15:56:32.021157 4743 scope.go:117] "RemoveContainer" containerID="0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49" Mar 10 15:56:32 crc kubenswrapper[4743]: E0310 
15:56:32.021664 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49\": container with ID starting with 0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49 not found: ID does not exist" containerID="0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49" Mar 10 15:56:32 crc kubenswrapper[4743]: I0310 15:56:32.021700 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49"} err="failed to get container status \"0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49\": rpc error: code = NotFound desc = could not find container \"0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49\": container with ID starting with 0b1262681e7dccdc94e5c2eaaa6d53db79d0a0e39c83f4f54896768399617f49 not found: ID does not exist" Mar 10 15:56:33 crc kubenswrapper[4743]: I0310 15:56:33.915663 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:56:33 crc kubenswrapper[4743]: E0310 15:56:33.916922 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:56:33 crc kubenswrapper[4743]: I0310 15:56:33.926870 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" path="/var/lib/kubelet/pods/d07cd290-6217-42a6-a22c-eb68211e2db2/volumes" Mar 10 15:56:45 crc kubenswrapper[4743]: I0310 15:56:45.932632 
4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:56:45 crc kubenswrapper[4743]: E0310 15:56:45.933259 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:56:57 crc kubenswrapper[4743]: I0310 15:56:57.915773 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:56:57 crc kubenswrapper[4743]: E0310 15:56:57.916522 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:57:11 crc kubenswrapper[4743]: I0310 15:57:11.916569 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:57:11 crc kubenswrapper[4743]: E0310 15:57:11.917367 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:57:23 crc kubenswrapper[4743]: I0310 
15:57:23.922513 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:57:23 crc kubenswrapper[4743]: E0310 15:57:23.924272 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:57:38 crc kubenswrapper[4743]: I0310 15:57:38.915692 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:57:38 crc kubenswrapper[4743]: E0310 15:57:38.916425 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:57:53 crc kubenswrapper[4743]: I0310 15:57:53.915910 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:57:53 crc kubenswrapper[4743]: E0310 15:57:53.916666 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:58:00 crc 
kubenswrapper[4743]: I0310 15:58:00.145755 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552638-rczk6"] Mar 10 15:58:00 crc kubenswrapper[4743]: E0310 15:58:00.146928 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerName="extract-content" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.146946 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerName="extract-content" Mar 10 15:58:00 crc kubenswrapper[4743]: E0310 15:58:00.146969 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerName="extract-utilities" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.146978 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerName="extract-utilities" Mar 10 15:58:00 crc kubenswrapper[4743]: E0310 15:58:00.146985 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerName="registry-server" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.146993 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerName="registry-server" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.147271 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d07cd290-6217-42a6-a22c-eb68211e2db2" containerName="registry-server" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.148228 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-rczk6" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.150609 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.150805 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.152609 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.189471 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-rczk6"] Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.287153 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lttl4\" (UniqueName: \"kubernetes.io/projected/6083869d-3374-4da7-bac1-1ee28af64e17-kube-api-access-lttl4\") pod \"auto-csr-approver-29552638-rczk6\" (UID: \"6083869d-3374-4da7-bac1-1ee28af64e17\") " pod="openshift-infra/auto-csr-approver-29552638-rczk6" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.388702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lttl4\" (UniqueName: \"kubernetes.io/projected/6083869d-3374-4da7-bac1-1ee28af64e17-kube-api-access-lttl4\") pod \"auto-csr-approver-29552638-rczk6\" (UID: \"6083869d-3374-4da7-bac1-1ee28af64e17\") " pod="openshift-infra/auto-csr-approver-29552638-rczk6" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.408332 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lttl4\" (UniqueName: \"kubernetes.io/projected/6083869d-3374-4da7-bac1-1ee28af64e17-kube-api-access-lttl4\") pod \"auto-csr-approver-29552638-rczk6\" (UID: \"6083869d-3374-4da7-bac1-1ee28af64e17\") " 
pod="openshift-infra/auto-csr-approver-29552638-rczk6" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.495616 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-rczk6" Mar 10 15:58:00 crc kubenswrapper[4743]: I0310 15:58:00.969299 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-rczk6"] Mar 10 15:58:01 crc kubenswrapper[4743]: I0310 15:58:01.024441 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552638-rczk6" event={"ID":"6083869d-3374-4da7-bac1-1ee28af64e17","Type":"ContainerStarted","Data":"a0c2d5992ec65d650d3e5cdb7ab020b097e633db95d3fed81d2cda99bbc601a9"} Mar 10 15:58:03 crc kubenswrapper[4743]: I0310 15:58:03.045198 4743 generic.go:334] "Generic (PLEG): container finished" podID="6083869d-3374-4da7-bac1-1ee28af64e17" containerID="7af1b6894f231304931c4a7efc75739f9240d79991194d664beefa9289d8ed6d" exitCode=0 Mar 10 15:58:03 crc kubenswrapper[4743]: I0310 15:58:03.045250 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552638-rczk6" event={"ID":"6083869d-3374-4da7-bac1-1ee28af64e17","Type":"ContainerDied","Data":"7af1b6894f231304931c4a7efc75739f9240d79991194d664beefa9289d8ed6d"} Mar 10 15:58:04 crc kubenswrapper[4743]: I0310 15:58:04.738267 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-rczk6" Mar 10 15:58:04 crc kubenswrapper[4743]: I0310 15:58:04.875282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lttl4\" (UniqueName: \"kubernetes.io/projected/6083869d-3374-4da7-bac1-1ee28af64e17-kube-api-access-lttl4\") pod \"6083869d-3374-4da7-bac1-1ee28af64e17\" (UID: \"6083869d-3374-4da7-bac1-1ee28af64e17\") " Mar 10 15:58:04 crc kubenswrapper[4743]: I0310 15:58:04.881831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6083869d-3374-4da7-bac1-1ee28af64e17-kube-api-access-lttl4" (OuterVolumeSpecName: "kube-api-access-lttl4") pod "6083869d-3374-4da7-bac1-1ee28af64e17" (UID: "6083869d-3374-4da7-bac1-1ee28af64e17"). InnerVolumeSpecName "kube-api-access-lttl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:58:04 crc kubenswrapper[4743]: I0310 15:58:04.915105 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:58:04 crc kubenswrapper[4743]: E0310 15:58:04.915602 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:58:04 crc kubenswrapper[4743]: I0310 15:58:04.977642 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lttl4\" (UniqueName: \"kubernetes.io/projected/6083869d-3374-4da7-bac1-1ee28af64e17-kube-api-access-lttl4\") on node \"crc\" DevicePath \"\"" Mar 10 15:58:05 crc kubenswrapper[4743]: I0310 15:58:05.067167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552638-rczk6" event={"ID":"6083869d-3374-4da7-bac1-1ee28af64e17","Type":"ContainerDied","Data":"a0c2d5992ec65d650d3e5cdb7ab020b097e633db95d3fed81d2cda99bbc601a9"} Mar 10 15:58:05 crc kubenswrapper[4743]: I0310 15:58:05.067215 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c2d5992ec65d650d3e5cdb7ab020b097e633db95d3fed81d2cda99bbc601a9" Mar 10 15:58:05 crc kubenswrapper[4743]: I0310 15:58:05.067299 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-rczk6" Mar 10 15:58:05 crc kubenswrapper[4743]: I0310 15:58:05.808108 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-sz2wx"] Mar 10 15:58:05 crc kubenswrapper[4743]: I0310 15:58:05.820290 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-sz2wx"] Mar 10 15:58:05 crc kubenswrapper[4743]: I0310 15:58:05.934681 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18725857-3c76-4bc6-8dd5-acd071f2d26a" path="/var/lib/kubelet/pods/18725857-3c76-4bc6-8dd5-acd071f2d26a/volumes" Mar 10 15:58:15 crc kubenswrapper[4743]: I0310 15:58:15.074916 4743 scope.go:117] "RemoveContainer" containerID="4458ada4c24becd9201554722d5265b1ff6689ed5fcba3d124da9e841a3f9a50" Mar 10 15:58:15 crc kubenswrapper[4743]: I0310 15:58:15.926922 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:58:15 crc kubenswrapper[4743]: E0310 15:58:15.927488 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:58:29 crc kubenswrapper[4743]: I0310 15:58:29.915246 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:58:29 crc kubenswrapper[4743]: E0310 15:58:29.915898 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:58:44 crc kubenswrapper[4743]: I0310 15:58:44.917465 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:58:44 crc kubenswrapper[4743]: E0310 15:58:44.918446 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:58:56 crc kubenswrapper[4743]: I0310 15:58:56.916589 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:58:56 crc kubenswrapper[4743]: E0310 15:58:56.918465 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:59:08 crc kubenswrapper[4743]: I0310 15:59:08.917282 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:59:08 crc kubenswrapper[4743]: E0310 15:59:08.918561 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:59:22 crc kubenswrapper[4743]: I0310 15:59:22.915047 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:59:22 crc kubenswrapper[4743]: E0310 15:59:22.915727 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:59:36 crc kubenswrapper[4743]: I0310 15:59:36.915416 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:59:36 crc kubenswrapper[4743]: E0310 15:59:36.916181 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 15:59:49 crc kubenswrapper[4743]: I0310 15:59:49.924741 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 15:59:49 crc kubenswrapper[4743]: E0310 15:59:49.925775 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.179509 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552640-5ncfn"] Mar 10 16:00:00 crc kubenswrapper[4743]: E0310 16:00:00.180657 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6083869d-3374-4da7-bac1-1ee28af64e17" containerName="oc" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.180678 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6083869d-3374-4da7-bac1-1ee28af64e17" containerName="oc" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.180983 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6083869d-3374-4da7-bac1-1ee28af64e17" containerName="oc" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.181822 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-5ncfn" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.183925 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.184505 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.184671 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.193273 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w"] Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.194715 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.197039 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.197265 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.216440 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-5ncfn"] Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.247042 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w"] Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.291322 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktf86\" (UniqueName: 
\"kubernetes.io/projected/59e8fd61-7218-4529-88c6-3567ad2d2756-kube-api-access-ktf86\") pod \"auto-csr-approver-29552640-5ncfn\" (UID: \"59e8fd61-7218-4529-88c6-3567ad2d2756\") " pod="openshift-infra/auto-csr-approver-29552640-5ncfn" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.291783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e920099b-06e6-44f2-a157-98c5330dc116-secret-volume\") pod \"collect-profiles-29552640-kj92w\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.292264 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e920099b-06e6-44f2-a157-98c5330dc116-config-volume\") pod \"collect-profiles-29552640-kj92w\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.292587 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdr5w\" (UniqueName: \"kubernetes.io/projected/e920099b-06e6-44f2-a157-98c5330dc116-kube-api-access-pdr5w\") pod \"collect-profiles-29552640-kj92w\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.394342 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e920099b-06e6-44f2-a157-98c5330dc116-secret-volume\") pod \"collect-profiles-29552640-kj92w\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 
crc kubenswrapper[4743]: I0310 16:00:00.394480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e920099b-06e6-44f2-a157-98c5330dc116-config-volume\") pod \"collect-profiles-29552640-kj92w\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.394668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdr5w\" (UniqueName: \"kubernetes.io/projected/e920099b-06e6-44f2-a157-98c5330dc116-kube-api-access-pdr5w\") pod \"collect-profiles-29552640-kj92w\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.394867 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktf86\" (UniqueName: \"kubernetes.io/projected/59e8fd61-7218-4529-88c6-3567ad2d2756-kube-api-access-ktf86\") pod \"auto-csr-approver-29552640-5ncfn\" (UID: \"59e8fd61-7218-4529-88c6-3567ad2d2756\") " pod="openshift-infra/auto-csr-approver-29552640-5ncfn" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.395655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e920099b-06e6-44f2-a157-98c5330dc116-config-volume\") pod \"collect-profiles-29552640-kj92w\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.403491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e920099b-06e6-44f2-a157-98c5330dc116-secret-volume\") pod \"collect-profiles-29552640-kj92w\" (UID: 
\"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.413199 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktf86\" (UniqueName: \"kubernetes.io/projected/59e8fd61-7218-4529-88c6-3567ad2d2756-kube-api-access-ktf86\") pod \"auto-csr-approver-29552640-5ncfn\" (UID: \"59e8fd61-7218-4529-88c6-3567ad2d2756\") " pod="openshift-infra/auto-csr-approver-29552640-5ncfn" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.413408 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdr5w\" (UniqueName: \"kubernetes.io/projected/e920099b-06e6-44f2-a157-98c5330dc116-kube-api-access-pdr5w\") pod \"collect-profiles-29552640-kj92w\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.502703 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-5ncfn" Mar 10 16:00:00 crc kubenswrapper[4743]: I0310 16:00:00.514831 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:01 crc kubenswrapper[4743]: I0310 16:00:01.016705 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:00:01 crc kubenswrapper[4743]: I0310 16:00:01.026775 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-5ncfn"] Mar 10 16:00:01 crc kubenswrapper[4743]: I0310 16:00:01.178080 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w"] Mar 10 16:00:01 crc kubenswrapper[4743]: I0310 16:00:01.423359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" event={"ID":"e920099b-06e6-44f2-a157-98c5330dc116","Type":"ContainerStarted","Data":"da2da16b47bf0d7aac22c8cd095351a67e6ebfc27003146c015c6365d69ca189"} Mar 10 16:00:01 crc kubenswrapper[4743]: I0310 16:00:01.423791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" event={"ID":"e920099b-06e6-44f2-a157-98c5330dc116","Type":"ContainerStarted","Data":"f87a6175ceaa3decb667e6dd800a8f3c304e0ccab6d06a0d206e9ae4ce1b3c07"} Mar 10 16:00:01 crc kubenswrapper[4743]: I0310 16:00:01.425293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-5ncfn" event={"ID":"59e8fd61-7218-4529-88c6-3567ad2d2756","Type":"ContainerStarted","Data":"a7c81711cf2ad3ebe16b0928d461c436b00ea6e83da5068970ba8fe4c0c31eaf"} Mar 10 16:00:01 crc kubenswrapper[4743]: I0310 16:00:01.447191 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" podStartSLOduration=1.447166763 podStartE2EDuration="1.447166763s" podCreationTimestamp="2026-03-10 16:00:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:00:01.444457527 +0000 UTC m=+3266.151272275" watchObservedRunningTime="2026-03-10 16:00:01.447166763 +0000 UTC m=+3266.153981521" Mar 10 16:00:02 crc kubenswrapper[4743]: I0310 16:00:02.436858 4743 generic.go:334] "Generic (PLEG): container finished" podID="e920099b-06e6-44f2-a157-98c5330dc116" containerID="da2da16b47bf0d7aac22c8cd095351a67e6ebfc27003146c015c6365d69ca189" exitCode=0 Mar 10 16:00:02 crc kubenswrapper[4743]: I0310 16:00:02.437174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" event={"ID":"e920099b-06e6-44f2-a157-98c5330dc116","Type":"ContainerDied","Data":"da2da16b47bf0d7aac22c8cd095351a67e6ebfc27003146c015c6365d69ca189"} Mar 10 16:00:02 crc kubenswrapper[4743]: I0310 16:00:02.916264 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 16:00:02 crc kubenswrapper[4743]: E0310 16:00:02.916626 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.127673 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.174356 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdr5w\" (UniqueName: \"kubernetes.io/projected/e920099b-06e6-44f2-a157-98c5330dc116-kube-api-access-pdr5w\") pod \"e920099b-06e6-44f2-a157-98c5330dc116\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.174533 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e920099b-06e6-44f2-a157-98c5330dc116-config-volume\") pod \"e920099b-06e6-44f2-a157-98c5330dc116\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.174728 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e920099b-06e6-44f2-a157-98c5330dc116-secret-volume\") pod \"e920099b-06e6-44f2-a157-98c5330dc116\" (UID: \"e920099b-06e6-44f2-a157-98c5330dc116\") " Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.175408 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e920099b-06e6-44f2-a157-98c5330dc116-config-volume" (OuterVolumeSpecName: "config-volume") pod "e920099b-06e6-44f2-a157-98c5330dc116" (UID: "e920099b-06e6-44f2-a157-98c5330dc116"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.181195 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e920099b-06e6-44f2-a157-98c5330dc116-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e920099b-06e6-44f2-a157-98c5330dc116" (UID: "e920099b-06e6-44f2-a157-98c5330dc116"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.194161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e920099b-06e6-44f2-a157-98c5330dc116-kube-api-access-pdr5w" (OuterVolumeSpecName: "kube-api-access-pdr5w") pod "e920099b-06e6-44f2-a157-98c5330dc116" (UID: "e920099b-06e6-44f2-a157-98c5330dc116"). InnerVolumeSpecName "kube-api-access-pdr5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.277727 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e920099b-06e6-44f2-a157-98c5330dc116-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.277774 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdr5w\" (UniqueName: \"kubernetes.io/projected/e920099b-06e6-44f2-a157-98c5330dc116-kube-api-access-pdr5w\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.277786 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e920099b-06e6-44f2-a157-98c5330dc116-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.460063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" event={"ID":"e920099b-06e6-44f2-a157-98c5330dc116","Type":"ContainerDied","Data":"f87a6175ceaa3decb667e6dd800a8f3c304e0ccab6d06a0d206e9ae4ce1b3c07"} Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.460340 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-kj92w" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.460430 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f87a6175ceaa3decb667e6dd800a8f3c304e0ccab6d06a0d206e9ae4ce1b3c07" Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.518999 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2"] Mar 10 16:00:04 crc kubenswrapper[4743]: I0310 16:00:04.527530 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-vq8s2"] Mar 10 16:00:05 crc kubenswrapper[4743]: I0310 16:00:05.470916 4743 generic.go:334] "Generic (PLEG): container finished" podID="59e8fd61-7218-4529-88c6-3567ad2d2756" containerID="ddf84644cad462c50115de2dad6f445c8c860663c3269ab2537303cf5020a383" exitCode=0 Mar 10 16:00:05 crc kubenswrapper[4743]: I0310 16:00:05.471060 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-5ncfn" event={"ID":"59e8fd61-7218-4529-88c6-3567ad2d2756","Type":"ContainerDied","Data":"ddf84644cad462c50115de2dad6f445c8c860663c3269ab2537303cf5020a383"} Mar 10 16:00:05 crc kubenswrapper[4743]: I0310 16:00:05.929773 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79af8e51-8ca1-41a0-9abc-3c7c33e01000" path="/var/lib/kubelet/pods/79af8e51-8ca1-41a0-9abc-3c7c33e01000/volumes" Mar 10 16:00:07 crc kubenswrapper[4743]: I0310 16:00:07.084152 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-5ncfn" Mar 10 16:00:07 crc kubenswrapper[4743]: I0310 16:00:07.235201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktf86\" (UniqueName: \"kubernetes.io/projected/59e8fd61-7218-4529-88c6-3567ad2d2756-kube-api-access-ktf86\") pod \"59e8fd61-7218-4529-88c6-3567ad2d2756\" (UID: \"59e8fd61-7218-4529-88c6-3567ad2d2756\") " Mar 10 16:00:07 crc kubenswrapper[4743]: I0310 16:00:07.241240 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e8fd61-7218-4529-88c6-3567ad2d2756-kube-api-access-ktf86" (OuterVolumeSpecName: "kube-api-access-ktf86") pod "59e8fd61-7218-4529-88c6-3567ad2d2756" (UID: "59e8fd61-7218-4529-88c6-3567ad2d2756"). InnerVolumeSpecName "kube-api-access-ktf86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:00:07 crc kubenswrapper[4743]: I0310 16:00:07.338229 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktf86\" (UniqueName: \"kubernetes.io/projected/59e8fd61-7218-4529-88c6-3567ad2d2756-kube-api-access-ktf86\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:07 crc kubenswrapper[4743]: I0310 16:00:07.493541 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-5ncfn" event={"ID":"59e8fd61-7218-4529-88c6-3567ad2d2756","Type":"ContainerDied","Data":"a7c81711cf2ad3ebe16b0928d461c436b00ea6e83da5068970ba8fe4c0c31eaf"} Mar 10 16:00:07 crc kubenswrapper[4743]: I0310 16:00:07.493887 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7c81711cf2ad3ebe16b0928d461c436b00ea6e83da5068970ba8fe4c0c31eaf" Mar 10 16:00:07 crc kubenswrapper[4743]: I0310 16:00:07.493626 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-5ncfn" Mar 10 16:00:08 crc kubenswrapper[4743]: I0310 16:00:08.140732 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-nvb5j"] Mar 10 16:00:08 crc kubenswrapper[4743]: I0310 16:00:08.152890 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-nvb5j"] Mar 10 16:00:09 crc kubenswrapper[4743]: I0310 16:00:09.928714 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa770b16-0f11-44ce-9565-67292e1e62ca" path="/var/lib/kubelet/pods/fa770b16-0f11-44ce-9565-67292e1e62ca/volumes" Mar 10 16:00:15 crc kubenswrapper[4743]: I0310 16:00:15.163492 4743 scope.go:117] "RemoveContainer" containerID="557c942064107673d0064b67df108324afaaa44b13040191c8d64bc30024bea0" Mar 10 16:00:15 crc kubenswrapper[4743]: I0310 16:00:15.196643 4743 scope.go:117] "RemoveContainer" containerID="526b2371123323292f8acb871aded42430652b5d046b75edb945c7aaeacd976b" Mar 10 16:00:17 crc kubenswrapper[4743]: I0310 16:00:17.915072 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 16:00:17 crc kubenswrapper[4743]: E0310 16:00:17.915833 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:00:32 crc kubenswrapper[4743]: I0310 16:00:32.916023 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 16:00:32 crc kubenswrapper[4743]: E0310 16:00:32.916918 4743 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:00:46 crc kubenswrapper[4743]: I0310 16:00:46.915930 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 16:00:47 crc kubenswrapper[4743]: I0310 16:00:47.902191 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"405bc24361e3bca0128de2227b390cf21c2c732269df1c9883fdd1a30a8f8e7a"} Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.163950 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29552641-qjhxn"] Mar 10 16:01:00 crc kubenswrapper[4743]: E0310 16:01:00.165021 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e8fd61-7218-4529-88c6-3567ad2d2756" containerName="oc" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.165042 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e8fd61-7218-4529-88c6-3567ad2d2756" containerName="oc" Mar 10 16:01:00 crc kubenswrapper[4743]: E0310 16:01:00.165077 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e920099b-06e6-44f2-a157-98c5330dc116" containerName="collect-profiles" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.165086 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e920099b-06e6-44f2-a157-98c5330dc116" containerName="collect-profiles" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.165326 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="59e8fd61-7218-4529-88c6-3567ad2d2756" containerName="oc" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.165350 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e920099b-06e6-44f2-a157-98c5330dc116" containerName="collect-profiles" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.166233 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.183175 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552641-qjhxn"] Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.256566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-combined-ca-bundle\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.256921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-config-data\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.256991 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d5m4\" (UniqueName: \"kubernetes.io/projected/87cf92f8-0e04-4abd-b668-40e17f635753-kube-api-access-9d5m4\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.257012 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-fernet-keys\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.359404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-combined-ca-bundle\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.359472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-config-data\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.359578 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d5m4\" (UniqueName: \"kubernetes.io/projected/87cf92f8-0e04-4abd-b668-40e17f635753-kube-api-access-9d5m4\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.359605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-fernet-keys\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.368337 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-combined-ca-bundle\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.368556 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-fernet-keys\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.368335 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-config-data\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.384343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d5m4\" (UniqueName: \"kubernetes.io/projected/87cf92f8-0e04-4abd-b668-40e17f635753-kube-api-access-9d5m4\") pod \"keystone-cron-29552641-qjhxn\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:00 crc kubenswrapper[4743]: I0310 16:01:00.488370 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:01 crc kubenswrapper[4743]: I0310 16:01:01.006776 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552641-qjhxn"] Mar 10 16:01:01 crc kubenswrapper[4743]: I0310 16:01:01.038484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552641-qjhxn" event={"ID":"87cf92f8-0e04-4abd-b668-40e17f635753","Type":"ContainerStarted","Data":"ad82538575e87ccade321a5e04dbd5a35af45b9559ab9089d01cd6e2000272af"} Mar 10 16:01:02 crc kubenswrapper[4743]: I0310 16:01:02.057944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552641-qjhxn" event={"ID":"87cf92f8-0e04-4abd-b668-40e17f635753","Type":"ContainerStarted","Data":"157e9da4fe7e4f1cacc486dc99351734b4c73ab43067da77540ce25e9798979d"} Mar 10 16:01:02 crc kubenswrapper[4743]: I0310 16:01:02.076511 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29552641-qjhxn" podStartSLOduration=2.07648682 podStartE2EDuration="2.07648682s" podCreationTimestamp="2026-03-10 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:01:02.073879537 +0000 UTC m=+3326.780694285" watchObservedRunningTime="2026-03-10 16:01:02.07648682 +0000 UTC m=+3326.783301568" Mar 10 16:01:04 crc kubenswrapper[4743]: I0310 16:01:04.098234 4743 generic.go:334] "Generic (PLEG): container finished" podID="87cf92f8-0e04-4abd-b668-40e17f635753" containerID="157e9da4fe7e4f1cacc486dc99351734b4c73ab43067da77540ce25e9798979d" exitCode=0 Mar 10 16:01:04 crc kubenswrapper[4743]: I0310 16:01:04.098462 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552641-qjhxn" event={"ID":"87cf92f8-0e04-4abd-b668-40e17f635753","Type":"ContainerDied","Data":"157e9da4fe7e4f1cacc486dc99351734b4c73ab43067da77540ce25e9798979d"} 
Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.749051 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.790976 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-fernet-keys\") pod \"87cf92f8-0e04-4abd-b668-40e17f635753\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.791120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d5m4\" (UniqueName: \"kubernetes.io/projected/87cf92f8-0e04-4abd-b668-40e17f635753-kube-api-access-9d5m4\") pod \"87cf92f8-0e04-4abd-b668-40e17f635753\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.791285 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-config-data\") pod \"87cf92f8-0e04-4abd-b668-40e17f635753\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.791998 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-combined-ca-bundle\") pod \"87cf92f8-0e04-4abd-b668-40e17f635753\" (UID: \"87cf92f8-0e04-4abd-b668-40e17f635753\") " Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.822916 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "87cf92f8-0e04-4abd-b668-40e17f635753" (UID: "87cf92f8-0e04-4abd-b668-40e17f635753"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.823145 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf92f8-0e04-4abd-b668-40e17f635753-kube-api-access-9d5m4" (OuterVolumeSpecName: "kube-api-access-9d5m4") pod "87cf92f8-0e04-4abd-b668-40e17f635753" (UID: "87cf92f8-0e04-4abd-b668-40e17f635753"). InnerVolumeSpecName "kube-api-access-9d5m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.853089 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87cf92f8-0e04-4abd-b668-40e17f635753" (UID: "87cf92f8-0e04-4abd-b668-40e17f635753"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.870545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-config-data" (OuterVolumeSpecName: "config-data") pod "87cf92f8-0e04-4abd-b668-40e17f635753" (UID: "87cf92f8-0e04-4abd-b668-40e17f635753"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.894874 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.894905 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d5m4\" (UniqueName: \"kubernetes.io/projected/87cf92f8-0e04-4abd-b668-40e17f635753-kube-api-access-9d5m4\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.894918 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:05 crc kubenswrapper[4743]: I0310 16:01:05.894927 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87cf92f8-0e04-4abd-b668-40e17f635753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:06 crc kubenswrapper[4743]: I0310 16:01:06.116770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552641-qjhxn" event={"ID":"87cf92f8-0e04-4abd-b668-40e17f635753","Type":"ContainerDied","Data":"ad82538575e87ccade321a5e04dbd5a35af45b9559ab9089d01cd6e2000272af"} Mar 10 16:01:06 crc kubenswrapper[4743]: I0310 16:01:06.116835 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad82538575e87ccade321a5e04dbd5a35af45b9559ab9089d01cd6e2000272af" Mar 10 16:01:06 crc kubenswrapper[4743]: I0310 16:01:06.116902 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552641-qjhxn" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.317788 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4lt5r"] Mar 10 16:01:17 crc kubenswrapper[4743]: E0310 16:01:17.318900 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cf92f8-0e04-4abd-b668-40e17f635753" containerName="keystone-cron" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.318918 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cf92f8-0e04-4abd-b668-40e17f635753" containerName="keystone-cron" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.319163 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="87cf92f8-0e04-4abd-b668-40e17f635753" containerName="keystone-cron" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.320977 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.325636 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4lt5r"] Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.428547 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-catalog-content\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.428877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549nl\" (UniqueName: \"kubernetes.io/projected/00b92d71-1e54-499a-a08c-a79dc2dc8b23-kube-api-access-549nl\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " 
pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.429008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-utilities\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.531016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-catalog-content\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.531316 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-549nl\" (UniqueName: \"kubernetes.io/projected/00b92d71-1e54-499a-a08c-a79dc2dc8b23-kube-api-access-549nl\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.531426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-utilities\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.531980 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-utilities\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " 
pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.532309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-catalog-content\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.561227 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-549nl\" (UniqueName: \"kubernetes.io/projected/00b92d71-1e54-499a-a08c-a79dc2dc8b23-kube-api-access-549nl\") pod \"certified-operators-4lt5r\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:17 crc kubenswrapper[4743]: I0310 16:01:17.643090 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:18 crc kubenswrapper[4743]: I0310 16:01:18.188169 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4lt5r"] Mar 10 16:01:18 crc kubenswrapper[4743]: I0310 16:01:18.259130 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lt5r" event={"ID":"00b92d71-1e54-499a-a08c-a79dc2dc8b23","Type":"ContainerStarted","Data":"66cabfd4bb5ff6066c568156bb45d3a15fddffa5eb29127bc4e355acd6566f9e"} Mar 10 16:01:19 crc kubenswrapper[4743]: I0310 16:01:19.270110 4743 generic.go:334] "Generic (PLEG): container finished" podID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerID="88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d" exitCode=0 Mar 10 16:01:19 crc kubenswrapper[4743]: I0310 16:01:19.270155 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lt5r" 
event={"ID":"00b92d71-1e54-499a-a08c-a79dc2dc8b23","Type":"ContainerDied","Data":"88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d"} Mar 10 16:01:21 crc kubenswrapper[4743]: I0310 16:01:21.295867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lt5r" event={"ID":"00b92d71-1e54-499a-a08c-a79dc2dc8b23","Type":"ContainerStarted","Data":"754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce"} Mar 10 16:01:23 crc kubenswrapper[4743]: I0310 16:01:23.315838 4743 generic.go:334] "Generic (PLEG): container finished" podID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerID="754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce" exitCode=0 Mar 10 16:01:23 crc kubenswrapper[4743]: I0310 16:01:23.315925 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lt5r" event={"ID":"00b92d71-1e54-499a-a08c-a79dc2dc8b23","Type":"ContainerDied","Data":"754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce"} Mar 10 16:01:24 crc kubenswrapper[4743]: I0310 16:01:24.327880 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lt5r" event={"ID":"00b92d71-1e54-499a-a08c-a79dc2dc8b23","Type":"ContainerStarted","Data":"a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7"} Mar 10 16:01:24 crc kubenswrapper[4743]: I0310 16:01:24.352877 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4lt5r" podStartSLOduration=2.544206897 podStartE2EDuration="7.352857471s" podCreationTimestamp="2026-03-10 16:01:17 +0000 UTC" firstStartedPulling="2026-03-10 16:01:19.272249613 +0000 UTC m=+3343.979064361" lastFinishedPulling="2026-03-10 16:01:24.080900197 +0000 UTC m=+3348.787714935" observedRunningTime="2026-03-10 16:01:24.350282638 +0000 UTC m=+3349.057097386" watchObservedRunningTime="2026-03-10 16:01:24.352857471 +0000 UTC 
m=+3349.059672219" Mar 10 16:01:27 crc kubenswrapper[4743]: I0310 16:01:27.644028 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:27 crc kubenswrapper[4743]: I0310 16:01:27.644632 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:28 crc kubenswrapper[4743]: I0310 16:01:28.698686 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4lt5r" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="registry-server" probeResult="failure" output=< Mar 10 16:01:28 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 16:01:28 crc kubenswrapper[4743]: > Mar 10 16:01:37 crc kubenswrapper[4743]: I0310 16:01:37.702407 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:37 crc kubenswrapper[4743]: I0310 16:01:37.769356 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:37 crc kubenswrapper[4743]: I0310 16:01:37.950332 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4lt5r"] Mar 10 16:01:39 crc kubenswrapper[4743]: I0310 16:01:39.485501 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4lt5r" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="registry-server" containerID="cri-o://a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7" gracePeriod=2 Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.199881 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.363288 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-catalog-content\") pod \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.363539 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-utilities\") pod \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.363611 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-549nl\" (UniqueName: \"kubernetes.io/projected/00b92d71-1e54-499a-a08c-a79dc2dc8b23-kube-api-access-549nl\") pod \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\" (UID: \"00b92d71-1e54-499a-a08c-a79dc2dc8b23\") " Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.364878 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-utilities" (OuterVolumeSpecName: "utilities") pod "00b92d71-1e54-499a-a08c-a79dc2dc8b23" (UID: "00b92d71-1e54-499a-a08c-a79dc2dc8b23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.369462 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b92d71-1e54-499a-a08c-a79dc2dc8b23-kube-api-access-549nl" (OuterVolumeSpecName: "kube-api-access-549nl") pod "00b92d71-1e54-499a-a08c-a79dc2dc8b23" (UID: "00b92d71-1e54-499a-a08c-a79dc2dc8b23"). InnerVolumeSpecName "kube-api-access-549nl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.445631 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00b92d71-1e54-499a-a08c-a79dc2dc8b23" (UID: "00b92d71-1e54-499a-a08c-a79dc2dc8b23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.466589 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.466627 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b92d71-1e54-499a-a08c-a79dc2dc8b23-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.466643 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-549nl\" (UniqueName: \"kubernetes.io/projected/00b92d71-1e54-499a-a08c-a79dc2dc8b23-kube-api-access-549nl\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.496774 4743 generic.go:334] "Generic (PLEG): container finished" podID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerID="a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7" exitCode=0 Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.496827 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4lt5r" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.496835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lt5r" event={"ID":"00b92d71-1e54-499a-a08c-a79dc2dc8b23","Type":"ContainerDied","Data":"a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7"} Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.496865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lt5r" event={"ID":"00b92d71-1e54-499a-a08c-a79dc2dc8b23","Type":"ContainerDied","Data":"66cabfd4bb5ff6066c568156bb45d3a15fddffa5eb29127bc4e355acd6566f9e"} Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.496887 4743 scope.go:117] "RemoveContainer" containerID="a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.518570 4743 scope.go:117] "RemoveContainer" containerID="754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.548619 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4lt5r"] Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.552130 4743 scope.go:117] "RemoveContainer" containerID="88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.562347 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4lt5r"] Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.597906 4743 scope.go:117] "RemoveContainer" containerID="a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7" Mar 10 16:01:40 crc kubenswrapper[4743]: E0310 16:01:40.598310 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7\": container with ID starting with a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7 not found: ID does not exist" containerID="a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.598355 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7"} err="failed to get container status \"a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7\": rpc error: code = NotFound desc = could not find container \"a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7\": container with ID starting with a91b11d1235a1ddf9ed184960685bddb0b941da08c07a66228bba8532a602ff7 not found: ID does not exist" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.598382 4743 scope.go:117] "RemoveContainer" containerID="754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce" Mar 10 16:01:40 crc kubenswrapper[4743]: E0310 16:01:40.598737 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce\": container with ID starting with 754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce not found: ID does not exist" containerID="754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.598769 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce"} err="failed to get container status \"754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce\": rpc error: code = NotFound desc = could not find container \"754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce\": container with ID 
starting with 754cf00ecc95b48b22e60c3429b2864be17760d7244c5a820b598ee8c8aef1ce not found: ID does not exist" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.598786 4743 scope.go:117] "RemoveContainer" containerID="88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d" Mar 10 16:01:40 crc kubenswrapper[4743]: E0310 16:01:40.599225 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d\": container with ID starting with 88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d not found: ID does not exist" containerID="88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d" Mar 10 16:01:40 crc kubenswrapper[4743]: I0310 16:01:40.599248 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d"} err="failed to get container status \"88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d\": rpc error: code = NotFound desc = could not find container \"88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d\": container with ID starting with 88a767f3e27ce5a04d0db906b0bff03fed8e8f9b221efd5557eba9e6936fa00d not found: ID does not exist" Mar 10 16:01:41 crc kubenswrapper[4743]: I0310 16:01:41.928891 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" path="/var/lib/kubelet/pods/00b92d71-1e54-499a-a08c-a79dc2dc8b23/volumes" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.158511 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552642-zcb2v"] Mar 10 16:02:00 crc kubenswrapper[4743]: E0310 16:02:00.159479 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="registry-server" Mar 10 16:02:00 crc 
kubenswrapper[4743]: I0310 16:02:00.159495 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="registry-server" Mar 10 16:02:00 crc kubenswrapper[4743]: E0310 16:02:00.159518 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="extract-content" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.159524 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="extract-content" Mar 10 16:02:00 crc kubenswrapper[4743]: E0310 16:02:00.159551 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="extract-utilities" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.159558 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="extract-utilities" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.159801 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b92d71-1e54-499a-a08c-a79dc2dc8b23" containerName="registry-server" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.160489 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-zcb2v" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.162804 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.162993 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.163451 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.174649 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbrq8\" (UniqueName: \"kubernetes.io/projected/eceb61d4-6c63-4509-8343-644538020013-kube-api-access-hbrq8\") pod \"auto-csr-approver-29552642-zcb2v\" (UID: \"eceb61d4-6c63-4509-8343-644538020013\") " pod="openshift-infra/auto-csr-approver-29552642-zcb2v" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.176104 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-zcb2v"] Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.276607 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbrq8\" (UniqueName: \"kubernetes.io/projected/eceb61d4-6c63-4509-8343-644538020013-kube-api-access-hbrq8\") pod \"auto-csr-approver-29552642-zcb2v\" (UID: \"eceb61d4-6c63-4509-8343-644538020013\") " pod="openshift-infra/auto-csr-approver-29552642-zcb2v" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.301111 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbrq8\" (UniqueName: \"kubernetes.io/projected/eceb61d4-6c63-4509-8343-644538020013-kube-api-access-hbrq8\") pod \"auto-csr-approver-29552642-zcb2v\" (UID: \"eceb61d4-6c63-4509-8343-644538020013\") " 
pod="openshift-infra/auto-csr-approver-29552642-zcb2v" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.483678 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-zcb2v" Mar 10 16:02:00 crc kubenswrapper[4743]: I0310 16:02:00.970136 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-zcb2v"] Mar 10 16:02:01 crc kubenswrapper[4743]: I0310 16:02:01.719306 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-zcb2v" event={"ID":"eceb61d4-6c63-4509-8343-644538020013","Type":"ContainerStarted","Data":"f469898c3d2c5a7a92e18ffad89e9f8ff38b4101462e4f4c83322146378a3d05"} Mar 10 16:02:02 crc kubenswrapper[4743]: I0310 16:02:02.730103 4743 generic.go:334] "Generic (PLEG): container finished" podID="eceb61d4-6c63-4509-8343-644538020013" containerID="34cd8a3e8a0b6bc0d9a2ca9885018e376b93bfe189c7619ac9b645d21ca70446" exitCode=0 Mar 10 16:02:02 crc kubenswrapper[4743]: I0310 16:02:02.730267 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-zcb2v" event={"ID":"eceb61d4-6c63-4509-8343-644538020013","Type":"ContainerDied","Data":"34cd8a3e8a0b6bc0d9a2ca9885018e376b93bfe189c7619ac9b645d21ca70446"} Mar 10 16:02:04 crc kubenswrapper[4743]: I0310 16:02:04.438753 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-zcb2v" Mar 10 16:02:04 crc kubenswrapper[4743]: I0310 16:02:04.468977 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbrq8\" (UniqueName: \"kubernetes.io/projected/eceb61d4-6c63-4509-8343-644538020013-kube-api-access-hbrq8\") pod \"eceb61d4-6c63-4509-8343-644538020013\" (UID: \"eceb61d4-6c63-4509-8343-644538020013\") " Mar 10 16:02:04 crc kubenswrapper[4743]: I0310 16:02:04.474274 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eceb61d4-6c63-4509-8343-644538020013-kube-api-access-hbrq8" (OuterVolumeSpecName: "kube-api-access-hbrq8") pod "eceb61d4-6c63-4509-8343-644538020013" (UID: "eceb61d4-6c63-4509-8343-644538020013"). InnerVolumeSpecName "kube-api-access-hbrq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:02:04 crc kubenswrapper[4743]: I0310 16:02:04.571742 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbrq8\" (UniqueName: \"kubernetes.io/projected/eceb61d4-6c63-4509-8343-644538020013-kube-api-access-hbrq8\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:04 crc kubenswrapper[4743]: I0310 16:02:04.746419 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-zcb2v" event={"ID":"eceb61d4-6c63-4509-8343-644538020013","Type":"ContainerDied","Data":"f469898c3d2c5a7a92e18ffad89e9f8ff38b4101462e4f4c83322146378a3d05"} Mar 10 16:02:04 crc kubenswrapper[4743]: I0310 16:02:04.746721 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f469898c3d2c5a7a92e18ffad89e9f8ff38b4101462e4f4c83322146378a3d05" Mar 10 16:02:04 crc kubenswrapper[4743]: I0310 16:02:04.746493 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-zcb2v" Mar 10 16:02:05 crc kubenswrapper[4743]: I0310 16:02:05.526593 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-72t2c"] Mar 10 16:02:05 crc kubenswrapper[4743]: I0310 16:02:05.538032 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-72t2c"] Mar 10 16:02:05 crc kubenswrapper[4743]: I0310 16:02:05.925714 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def90d0b-f33b-4496-b98c-ebe7cfcf9e9f" path="/var/lib/kubelet/pods/def90d0b-f33b-4496-b98c-ebe7cfcf9e9f/volumes" Mar 10 16:02:15 crc kubenswrapper[4743]: I0310 16:02:15.383792 4743 scope.go:117] "RemoveContainer" containerID="20669d201fefe4e6d486ac5cefb27b65aa1b98b0081a5d9611138b653a64a856" Mar 10 16:03:11 crc kubenswrapper[4743]: I0310 16:03:11.252877 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:03:11 crc kubenswrapper[4743]: I0310 16:03:11.253378 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:03:20 crc kubenswrapper[4743]: I0310 16:03:20.892071 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2rxpp"] Mar 10 16:03:20 crc kubenswrapper[4743]: E0310 16:03:20.893086 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eceb61d4-6c63-4509-8343-644538020013" containerName="oc" Mar 10 16:03:20 crc 
kubenswrapper[4743]: I0310 16:03:20.893100 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eceb61d4-6c63-4509-8343-644538020013" containerName="oc" Mar 10 16:03:20 crc kubenswrapper[4743]: I0310 16:03:20.893334 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eceb61d4-6c63-4509-8343-644538020013" containerName="oc" Mar 10 16:03:20 crc kubenswrapper[4743]: I0310 16:03:20.894630 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:20 crc kubenswrapper[4743]: I0310 16:03:20.913589 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2rxpp"] Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.019875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh4nm\" (UniqueName: \"kubernetes.io/projected/c7b4c568-a0b4-41e6-9f7f-741204512592-kube-api-access-dh4nm\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.020010 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-utilities\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.020740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-catalog-content\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 
16:03:21.122546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-utilities\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.122667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-catalog-content\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.122731 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh4nm\" (UniqueName: \"kubernetes.io/projected/c7b4c568-a0b4-41e6-9f7f-741204512592-kube-api-access-dh4nm\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.123488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-utilities\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.123573 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-catalog-content\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.143866 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dh4nm\" (UniqueName: \"kubernetes.io/projected/c7b4c568-a0b4-41e6-9f7f-741204512592-kube-api-access-dh4nm\") pod \"redhat-operators-2rxpp\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.257190 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:21 crc kubenswrapper[4743]: I0310 16:03:21.790055 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2rxpp"] Mar 10 16:03:22 crc kubenswrapper[4743]: I0310 16:03:22.470352 4743 generic.go:334] "Generic (PLEG): container finished" podID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerID="a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6" exitCode=0 Mar 10 16:03:22 crc kubenswrapper[4743]: I0310 16:03:22.470388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rxpp" event={"ID":"c7b4c568-a0b4-41e6-9f7f-741204512592","Type":"ContainerDied","Data":"a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6"} Mar 10 16:03:22 crc kubenswrapper[4743]: I0310 16:03:22.470635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rxpp" event={"ID":"c7b4c568-a0b4-41e6-9f7f-741204512592","Type":"ContainerStarted","Data":"bd7889b3758691a328addca0766b3bfe28468a43f61cfcf065bdefa9c55395c1"} Mar 10 16:03:24 crc kubenswrapper[4743]: I0310 16:03:24.497380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rxpp" event={"ID":"c7b4c568-a0b4-41e6-9f7f-741204512592","Type":"ContainerStarted","Data":"8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a"} Mar 10 16:03:29 crc kubenswrapper[4743]: I0310 16:03:29.541234 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerID="8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a" exitCode=0 Mar 10 16:03:29 crc kubenswrapper[4743]: I0310 16:03:29.541756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rxpp" event={"ID":"c7b4c568-a0b4-41e6-9f7f-741204512592","Type":"ContainerDied","Data":"8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a"} Mar 10 16:03:30 crc kubenswrapper[4743]: I0310 16:03:30.554794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rxpp" event={"ID":"c7b4c568-a0b4-41e6-9f7f-741204512592","Type":"ContainerStarted","Data":"edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba"} Mar 10 16:03:30 crc kubenswrapper[4743]: I0310 16:03:30.617298 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2rxpp" podStartSLOduration=3.146020349 podStartE2EDuration="10.617260722s" podCreationTimestamp="2026-03-10 16:03:20 +0000 UTC" firstStartedPulling="2026-03-10 16:03:22.471941366 +0000 UTC m=+3467.178756114" lastFinishedPulling="2026-03-10 16:03:29.943181739 +0000 UTC m=+3474.649996487" observedRunningTime="2026-03-10 16:03:30.616299116 +0000 UTC m=+3475.323113864" watchObservedRunningTime="2026-03-10 16:03:30.617260722 +0000 UTC m=+3475.324075490" Mar 10 16:03:31 crc kubenswrapper[4743]: I0310 16:03:31.257919 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:31 crc kubenswrapper[4743]: I0310 16:03:31.258535 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:03:32 crc kubenswrapper[4743]: I0310 16:03:32.462526 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2rxpp" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" 
containerName="registry-server" probeResult="failure" output=< Mar 10 16:03:32 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 16:03:32 crc kubenswrapper[4743]: > Mar 10 16:03:41 crc kubenswrapper[4743]: I0310 16:03:41.252481 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:03:41 crc kubenswrapper[4743]: I0310 16:03:41.253060 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:03:42 crc kubenswrapper[4743]: I0310 16:03:42.305411 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2rxpp" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="registry-server" probeResult="failure" output=< Mar 10 16:03:42 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 16:03:42 crc kubenswrapper[4743]: > Mar 10 16:03:52 crc kubenswrapper[4743]: I0310 16:03:52.313481 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2rxpp" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="registry-server" probeResult="failure" output=< Mar 10 16:03:52 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 16:03:52 crc kubenswrapper[4743]: > Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.153410 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552644-pdw4b"] Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 
16:04:00.155668 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-pdw4b" Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.158072 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.158398 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.158910 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.168479 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-pdw4b"] Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.248525 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvmrr\" (UniqueName: \"kubernetes.io/projected/cceb81a2-a8b4-418b-9abd-31ea65f24354-kube-api-access-nvmrr\") pod \"auto-csr-approver-29552644-pdw4b\" (UID: \"cceb81a2-a8b4-418b-9abd-31ea65f24354\") " pod="openshift-infra/auto-csr-approver-29552644-pdw4b" Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.350165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmrr\" (UniqueName: \"kubernetes.io/projected/cceb81a2-a8b4-418b-9abd-31ea65f24354-kube-api-access-nvmrr\") pod \"auto-csr-approver-29552644-pdw4b\" (UID: \"cceb81a2-a8b4-418b-9abd-31ea65f24354\") " pod="openshift-infra/auto-csr-approver-29552644-pdw4b" Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.372505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvmrr\" (UniqueName: \"kubernetes.io/projected/cceb81a2-a8b4-418b-9abd-31ea65f24354-kube-api-access-nvmrr\") pod 
\"auto-csr-approver-29552644-pdw4b\" (UID: \"cceb81a2-a8b4-418b-9abd-31ea65f24354\") " pod="openshift-infra/auto-csr-approver-29552644-pdw4b" Mar 10 16:04:00 crc kubenswrapper[4743]: I0310 16:04:00.478921 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-pdw4b" Mar 10 16:04:01 crc kubenswrapper[4743]: I0310 16:04:01.002579 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-pdw4b"] Mar 10 16:04:01 crc kubenswrapper[4743]: I0310 16:04:01.312156 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:04:01 crc kubenswrapper[4743]: I0310 16:04:01.372974 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:04:01 crc kubenswrapper[4743]: I0310 16:04:01.554717 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2rxpp"] Mar 10 16:04:01 crc kubenswrapper[4743]: I0310 16:04:01.848313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-pdw4b" event={"ID":"cceb81a2-a8b4-418b-9abd-31ea65f24354","Type":"ContainerStarted","Data":"15ed8802ee1d3adde6cc51cb2f18b5b40c61583f1d90db331005bf7c0dda31de"} Mar 10 16:04:02 crc kubenswrapper[4743]: I0310 16:04:02.857026 4743 generic.go:334] "Generic (PLEG): container finished" podID="cceb81a2-a8b4-418b-9abd-31ea65f24354" containerID="54ed7b5e77de7ed5fe02e6bb9168d0116d1742b0b96abf7c7d0a632b32ee6fc2" exitCode=0 Mar 10 16:04:02 crc kubenswrapper[4743]: I0310 16:04:02.857438 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2rxpp" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="registry-server" containerID="cri-o://edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba" 
gracePeriod=2 Mar 10 16:04:02 crc kubenswrapper[4743]: I0310 16:04:02.857729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-pdw4b" event={"ID":"cceb81a2-a8b4-418b-9abd-31ea65f24354","Type":"ContainerDied","Data":"54ed7b5e77de7ed5fe02e6bb9168d0116d1742b0b96abf7c7d0a632b32ee6fc2"} Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.585630 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.718046 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-utilities\") pod \"c7b4c568-a0b4-41e6-9f7f-741204512592\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.718129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-catalog-content\") pod \"c7b4c568-a0b4-41e6-9f7f-741204512592\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.718354 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh4nm\" (UniqueName: \"kubernetes.io/projected/c7b4c568-a0b4-41e6-9f7f-741204512592-kube-api-access-dh4nm\") pod \"c7b4c568-a0b4-41e6-9f7f-741204512592\" (UID: \"c7b4c568-a0b4-41e6-9f7f-741204512592\") " Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.719269 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-utilities" (OuterVolumeSpecName: "utilities") pod "c7b4c568-a0b4-41e6-9f7f-741204512592" (UID: "c7b4c568-a0b4-41e6-9f7f-741204512592"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.737759 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b4c568-a0b4-41e6-9f7f-741204512592-kube-api-access-dh4nm" (OuterVolumeSpecName: "kube-api-access-dh4nm") pod "c7b4c568-a0b4-41e6-9f7f-741204512592" (UID: "c7b4c568-a0b4-41e6-9f7f-741204512592"). InnerVolumeSpecName "kube-api-access-dh4nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.823840 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.823884 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh4nm\" (UniqueName: \"kubernetes.io/projected/c7b4c568-a0b4-41e6-9f7f-741204512592-kube-api-access-dh4nm\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.871796 4743 generic.go:334] "Generic (PLEG): container finished" podID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerID="edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba" exitCode=0 Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.872027 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2rxpp" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.872065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rxpp" event={"ID":"c7b4c568-a0b4-41e6-9f7f-741204512592","Type":"ContainerDied","Data":"edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba"} Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.872114 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rxpp" event={"ID":"c7b4c568-a0b4-41e6-9f7f-741204512592","Type":"ContainerDied","Data":"bd7889b3758691a328addca0766b3bfe28468a43f61cfcf065bdefa9c55395c1"} Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.872132 4743 scope.go:117] "RemoveContainer" containerID="edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.905246 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7b4c568-a0b4-41e6-9f7f-741204512592" (UID: "c7b4c568-a0b4-41e6-9f7f-741204512592"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.918220 4743 scope.go:117] "RemoveContainer" containerID="8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a" Mar 10 16:04:03 crc kubenswrapper[4743]: I0310 16:04:03.925477 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7b4c568-a0b4-41e6-9f7f-741204512592-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.009992 4743 scope.go:117] "RemoveContainer" containerID="a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.095282 4743 scope.go:117] "RemoveContainer" containerID="edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba" Mar 10 16:04:04 crc kubenswrapper[4743]: E0310 16:04:04.097357 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba\": container with ID starting with edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba not found: ID does not exist" containerID="edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.097422 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba"} err="failed to get container status \"edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba\": rpc error: code = NotFound desc = could not find container \"edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba\": container with ID starting with edbc9dea4bec6f31f4c50cb0dcf2d878f57f82f69a7474fe94f92ba1d050e0ba not found: ID does not exist" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.097477 4743 
scope.go:117] "RemoveContainer" containerID="8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a" Mar 10 16:04:04 crc kubenswrapper[4743]: E0310 16:04:04.098127 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a\": container with ID starting with 8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a not found: ID does not exist" containerID="8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.098162 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a"} err="failed to get container status \"8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a\": rpc error: code = NotFound desc = could not find container \"8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a\": container with ID starting with 8d7707ea243dcb1802b86a282a69b4ac1d0decb750c54f1c251c420a99c0833a not found: ID does not exist" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.098177 4743 scope.go:117] "RemoveContainer" containerID="a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6" Mar 10 16:04:04 crc kubenswrapper[4743]: E0310 16:04:04.098478 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6\": container with ID starting with a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6 not found: ID does not exist" containerID="a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.098502 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6"} err="failed to get container status \"a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6\": rpc error: code = NotFound desc = could not find container \"a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6\": container with ID starting with a796a128e38631196648f0c2d449cf9b48e73a68cae0b5a0f414b67045c411b6 not found: ID does not exist" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.210372 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2rxpp"] Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.250768 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2rxpp"] Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.457006 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-pdw4b" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.643734 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvmrr\" (UniqueName: \"kubernetes.io/projected/cceb81a2-a8b4-418b-9abd-31ea65f24354-kube-api-access-nvmrr\") pod \"cceb81a2-a8b4-418b-9abd-31ea65f24354\" (UID: \"cceb81a2-a8b4-418b-9abd-31ea65f24354\") " Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.652590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cceb81a2-a8b4-418b-9abd-31ea65f24354-kube-api-access-nvmrr" (OuterVolumeSpecName: "kube-api-access-nvmrr") pod "cceb81a2-a8b4-418b-9abd-31ea65f24354" (UID: "cceb81a2-a8b4-418b-9abd-31ea65f24354"). InnerVolumeSpecName "kube-api-access-nvmrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.746282 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvmrr\" (UniqueName: \"kubernetes.io/projected/cceb81a2-a8b4-418b-9abd-31ea65f24354-kube-api-access-nvmrr\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.898530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-pdw4b" event={"ID":"cceb81a2-a8b4-418b-9abd-31ea65f24354","Type":"ContainerDied","Data":"15ed8802ee1d3adde6cc51cb2f18b5b40c61583f1d90db331005bf7c0dda31de"} Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.898581 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15ed8802ee1d3adde6cc51cb2f18b5b40c61583f1d90db331005bf7c0dda31de" Mar 10 16:04:04 crc kubenswrapper[4743]: I0310 16:04:04.898630 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-pdw4b" Mar 10 16:04:05 crc kubenswrapper[4743]: I0310 16:04:05.556268 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-rczk6"] Mar 10 16:04:05 crc kubenswrapper[4743]: I0310 16:04:05.564163 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-rczk6"] Mar 10 16:04:05 crc kubenswrapper[4743]: I0310 16:04:05.930255 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6083869d-3374-4da7-bac1-1ee28af64e17" path="/var/lib/kubelet/pods/6083869d-3374-4da7-bac1-1ee28af64e17/volumes" Mar 10 16:04:05 crc kubenswrapper[4743]: I0310 16:04:05.931372 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" path="/var/lib/kubelet/pods/c7b4c568-a0b4-41e6-9f7f-741204512592/volumes" Mar 10 16:04:11 crc kubenswrapper[4743]: I0310 16:04:11.252502 4743 
patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:04:11 crc kubenswrapper[4743]: I0310 16:04:11.253182 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:04:11 crc kubenswrapper[4743]: I0310 16:04:11.253267 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 16:04:11 crc kubenswrapper[4743]: I0310 16:04:11.254180 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"405bc24361e3bca0128de2227b390cf21c2c732269df1c9883fdd1a30a8f8e7a"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:04:11 crc kubenswrapper[4743]: I0310 16:04:11.254236 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://405bc24361e3bca0128de2227b390cf21c2c732269df1c9883fdd1a30a8f8e7a" gracePeriod=600 Mar 10 16:04:11 crc kubenswrapper[4743]: I0310 16:04:11.963844 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="405bc24361e3bca0128de2227b390cf21c2c732269df1c9883fdd1a30a8f8e7a" exitCode=0 Mar 10 16:04:11 crc 
kubenswrapper[4743]: I0310 16:04:11.963892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"405bc24361e3bca0128de2227b390cf21c2c732269df1c9883fdd1a30a8f8e7a"} Mar 10 16:04:11 crc kubenswrapper[4743]: I0310 16:04:11.964375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"} Mar 10 16:04:11 crc kubenswrapper[4743]: I0310 16:04:11.964401 4743 scope.go:117] "RemoveContainer" containerID="0bd9e223f9a4485612ca7347441288cb98dff0956d7c7c44fb6881b6c005d90e" Mar 10 16:04:15 crc kubenswrapper[4743]: I0310 16:04:15.524615 4743 scope.go:117] "RemoveContainer" containerID="7af1b6894f231304931c4a7efc75739f9240d79991194d664beefa9289d8ed6d" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.410778 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ngmj6"] Mar 10 16:05:16 crc kubenswrapper[4743]: E0310 16:05:16.411797 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="extract-content" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.411826 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="extract-content" Mar 10 16:05:16 crc kubenswrapper[4743]: E0310 16:05:16.411856 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="extract-utilities" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.411863 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="extract-utilities" Mar 10 16:05:16 crc 
kubenswrapper[4743]: E0310 16:05:16.411883 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="registry-server" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.411891 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="registry-server" Mar 10 16:05:16 crc kubenswrapper[4743]: E0310 16:05:16.411902 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cceb81a2-a8b4-418b-9abd-31ea65f24354" containerName="oc" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.411907 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cceb81a2-a8b4-418b-9abd-31ea65f24354" containerName="oc" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.412077 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cceb81a2-a8b4-418b-9abd-31ea65f24354" containerName="oc" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.412105 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b4c568-a0b4-41e6-9f7f-741204512592" containerName="registry-server" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.413458 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.426726 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngmj6"] Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.466882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-catalog-content\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.466939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6726p\" (UniqueName: \"kubernetes.io/projected/6dfbf18d-e483-476f-9b01-8a56d2779fc6-kube-api-access-6726p\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.467026 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-utilities\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.568359 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-catalog-content\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.568654 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6726p\" (UniqueName: \"kubernetes.io/projected/6dfbf18d-e483-476f-9b01-8a56d2779fc6-kube-api-access-6726p\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.568723 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-utilities\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.569230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-utilities\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.569437 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-catalog-content\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.591615 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6726p\" (UniqueName: \"kubernetes.io/projected/6dfbf18d-e483-476f-9b01-8a56d2779fc6-kube-api-access-6726p\") pod \"community-operators-ngmj6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") " pod="openshift-marketplace/community-operators-ngmj6" Mar 10 16:05:16 crc kubenswrapper[4743]: I0310 16:05:16.737758 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngmj6"
Mar 10 16:05:17 crc kubenswrapper[4743]: I0310 16:05:17.318324 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngmj6"]
Mar 10 16:05:17 crc kubenswrapper[4743]: I0310 16:05:17.587327 4743 generic.go:334] "Generic (PLEG): container finished" podID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerID="d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5" exitCode=0
Mar 10 16:05:17 crc kubenswrapper[4743]: I0310 16:05:17.587401 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngmj6" event={"ID":"6dfbf18d-e483-476f-9b01-8a56d2779fc6","Type":"ContainerDied","Data":"d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5"}
Mar 10 16:05:17 crc kubenswrapper[4743]: I0310 16:05:17.587550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngmj6" event={"ID":"6dfbf18d-e483-476f-9b01-8a56d2779fc6","Type":"ContainerStarted","Data":"cef06fb7ba7cfcca8315671d1fd8a3bb4edd57fdc7493a5d8eb04e82b5ac88be"}
Mar 10 16:05:17 crc kubenswrapper[4743]: I0310 16:05:17.589387 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 16:05:18 crc kubenswrapper[4743]: I0310 16:05:18.600101 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngmj6" event={"ID":"6dfbf18d-e483-476f-9b01-8a56d2779fc6","Type":"ContainerStarted","Data":"2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9"}
Mar 10 16:05:20 crc kubenswrapper[4743]: I0310 16:05:20.620020 4743 generic.go:334] "Generic (PLEG): container finished" podID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerID="2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9" exitCode=0
Mar 10 16:05:20 crc kubenswrapper[4743]: I0310 16:05:20.620106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngmj6" event={"ID":"6dfbf18d-e483-476f-9b01-8a56d2779fc6","Type":"ContainerDied","Data":"2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9"}
Mar 10 16:05:21 crc kubenswrapper[4743]: I0310 16:05:21.631489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngmj6" event={"ID":"6dfbf18d-e483-476f-9b01-8a56d2779fc6","Type":"ContainerStarted","Data":"054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927"}
Mar 10 16:05:21 crc kubenswrapper[4743]: I0310 16:05:21.651019 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ngmj6" podStartSLOduration=2.180606133 podStartE2EDuration="5.650997434s" podCreationTimestamp="2026-03-10 16:05:16 +0000 UTC" firstStartedPulling="2026-03-10 16:05:17.589144097 +0000 UTC m=+3582.295958845" lastFinishedPulling="2026-03-10 16:05:21.059535398 +0000 UTC m=+3585.766350146" observedRunningTime="2026-03-10 16:05:21.649572326 +0000 UTC m=+3586.356387084" watchObservedRunningTime="2026-03-10 16:05:21.650997434 +0000 UTC m=+3586.357812182"
Mar 10 16:05:26 crc kubenswrapper[4743]: I0310 16:05:26.738537 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ngmj6"
Mar 10 16:05:26 crc kubenswrapper[4743]: I0310 16:05:26.739119 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ngmj6"
Mar 10 16:05:26 crc kubenswrapper[4743]: I0310 16:05:26.791166 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ngmj6"
Mar 10 16:05:27 crc kubenswrapper[4743]: I0310 16:05:27.755631 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ngmj6"
Mar 10 16:05:27 crc kubenswrapper[4743]: I0310 16:05:27.813358 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngmj6"]
Mar 10 16:05:29 crc kubenswrapper[4743]: I0310 16:05:29.716672 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ngmj6" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerName="registry-server" containerID="cri-o://054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927" gracePeriod=2
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.484859 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngmj6"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.651027 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6726p\" (UniqueName: \"kubernetes.io/projected/6dfbf18d-e483-476f-9b01-8a56d2779fc6-kube-api-access-6726p\") pod \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") "
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.651087 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-utilities\") pod \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") "
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.651138 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-catalog-content\") pod \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\" (UID: \"6dfbf18d-e483-476f-9b01-8a56d2779fc6\") "
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.651933 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-utilities" (OuterVolumeSpecName: "utilities") pod "6dfbf18d-e483-476f-9b01-8a56d2779fc6" (UID: "6dfbf18d-e483-476f-9b01-8a56d2779fc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.652018 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.658059 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfbf18d-e483-476f-9b01-8a56d2779fc6-kube-api-access-6726p" (OuterVolumeSpecName: "kube-api-access-6726p") pod "6dfbf18d-e483-476f-9b01-8a56d2779fc6" (UID: "6dfbf18d-e483-476f-9b01-8a56d2779fc6"). InnerVolumeSpecName "kube-api-access-6726p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.702706 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dfbf18d-e483-476f-9b01-8a56d2779fc6" (UID: "6dfbf18d-e483-476f-9b01-8a56d2779fc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.732331 4743 generic.go:334] "Generic (PLEG): container finished" podID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerID="054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927" exitCode=0
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.732366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngmj6" event={"ID":"6dfbf18d-e483-476f-9b01-8a56d2779fc6","Type":"ContainerDied","Data":"054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927"}
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.732431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngmj6" event={"ID":"6dfbf18d-e483-476f-9b01-8a56d2779fc6","Type":"ContainerDied","Data":"cef06fb7ba7cfcca8315671d1fd8a3bb4edd57fdc7493a5d8eb04e82b5ac88be"}
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.732452 4743 scope.go:117] "RemoveContainer" containerID="054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.732451 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngmj6"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.754163 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbf18d-e483-476f-9b01-8a56d2779fc6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.754208 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6726p\" (UniqueName: \"kubernetes.io/projected/6dfbf18d-e483-476f-9b01-8a56d2779fc6-kube-api-access-6726p\") on node \"crc\" DevicePath \"\""
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.775246 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngmj6"]
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.786971 4743 scope.go:117] "RemoveContainer" containerID="2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.787115 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ngmj6"]
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.814266 4743 scope.go:117] "RemoveContainer" containerID="d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.887880 4743 scope.go:117] "RemoveContainer" containerID="054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927"
Mar 10 16:05:30 crc kubenswrapper[4743]: E0310 16:05:30.888558 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927\": container with ID starting with 054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927 not found: ID does not exist" containerID="054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.888596 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927"} err="failed to get container status \"054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927\": rpc error: code = NotFound desc = could not find container \"054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927\": container with ID starting with 054ad408f36ad83db750c92ff608492b6aa9facb79383e2b3c9fd9db37e01927 not found: ID does not exist"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.888623 4743 scope.go:117] "RemoveContainer" containerID="2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9"
Mar 10 16:05:30 crc kubenswrapper[4743]: E0310 16:05:30.889078 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9\": container with ID starting with 2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9 not found: ID does not exist" containerID="2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.889127 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9"} err="failed to get container status \"2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9\": rpc error: code = NotFound desc = could not find container \"2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9\": container with ID starting with 2dcb7af974343c4fa01726f3c1d2e7dde10c88c41f8141f629e5dbdaeb48e1d9 not found: ID does not exist"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.889159 4743 scope.go:117] "RemoveContainer" containerID="d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5"
Mar 10 16:05:30 crc kubenswrapper[4743]: E0310 16:05:30.889564 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5\": container with ID starting with d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5 not found: ID does not exist" containerID="d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5"
Mar 10 16:05:30 crc kubenswrapper[4743]: I0310 16:05:30.889654 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5"} err="failed to get container status \"d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5\": rpc error: code = NotFound desc = could not find container \"d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5\": container with ID starting with d7d63d095a7ffb3c2959cfd4945b64cd767b8caf93d69a5a0c6f33c1ecd1f7e5 not found: ID does not exist"
Mar 10 16:05:31 crc kubenswrapper[4743]: I0310 16:05:31.928674 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" path="/var/lib/kubelet/pods/6dfbf18d-e483-476f-9b01-8a56d2779fc6/volumes"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.146854 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552646-5vsd4"]
Mar 10 16:06:00 crc kubenswrapper[4743]: E0310 16:06:00.158778 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerName="extract-utilities"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.158841 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerName="extract-utilities"
Mar 10 16:06:00 crc kubenswrapper[4743]: E0310 16:06:00.158870 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerName="extract-content"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.158881 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerName="extract-content"
Mar 10 16:06:00 crc kubenswrapper[4743]: E0310 16:06:00.158901 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerName="registry-server"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.158909 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerName="registry-server"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.159239 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfbf18d-e483-476f-9b01-8a56d2779fc6" containerName="registry-server"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.159900 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-5vsd4"]
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.160000 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-5vsd4"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.161618 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.161665 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.162326 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.263366 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4ln2\" (UniqueName: \"kubernetes.io/projected/f2397075-b0df-4433-9997-f2fb1cb0db0e-kube-api-access-w4ln2\") pod \"auto-csr-approver-29552646-5vsd4\" (UID: \"f2397075-b0df-4433-9997-f2fb1cb0db0e\") " pod="openshift-infra/auto-csr-approver-29552646-5vsd4"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.365143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ln2\" (UniqueName: \"kubernetes.io/projected/f2397075-b0df-4433-9997-f2fb1cb0db0e-kube-api-access-w4ln2\") pod \"auto-csr-approver-29552646-5vsd4\" (UID: \"f2397075-b0df-4433-9997-f2fb1cb0db0e\") " pod="openshift-infra/auto-csr-approver-29552646-5vsd4"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.385689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ln2\" (UniqueName: \"kubernetes.io/projected/f2397075-b0df-4433-9997-f2fb1cb0db0e-kube-api-access-w4ln2\") pod \"auto-csr-approver-29552646-5vsd4\" (UID: \"f2397075-b0df-4433-9997-f2fb1cb0db0e\") " pod="openshift-infra/auto-csr-approver-29552646-5vsd4"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.486929 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-5vsd4"
Mar 10 16:06:00 crc kubenswrapper[4743]: I0310 16:06:00.978097 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-5vsd4"]
Mar 10 16:06:01 crc kubenswrapper[4743]: I0310 16:06:01.010845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-5vsd4" event={"ID":"f2397075-b0df-4433-9997-f2fb1cb0db0e","Type":"ContainerStarted","Data":"2b1d81ecf09b78a6462de8e2244c19d34754c1ad06718a596401997318924bab"}
Mar 10 16:06:04 crc kubenswrapper[4743]: I0310 16:06:04.070422 4743 generic.go:334] "Generic (PLEG): container finished" podID="f2397075-b0df-4433-9997-f2fb1cb0db0e" containerID="3c34cdafaf2bec0bf4a77f7b3cd4289ea7cce33610e45169d34f7f16c5500c47" exitCode=0
Mar 10 16:06:04 crc kubenswrapper[4743]: I0310 16:06:04.070520 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-5vsd4" event={"ID":"f2397075-b0df-4433-9997-f2fb1cb0db0e","Type":"ContainerDied","Data":"3c34cdafaf2bec0bf4a77f7b3cd4289ea7cce33610e45169d34f7f16c5500c47"}
Mar 10 16:06:05 crc kubenswrapper[4743]: I0310 16:06:05.708438 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-5vsd4"
Mar 10 16:06:05 crc kubenswrapper[4743]: I0310 16:06:05.794989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4ln2\" (UniqueName: \"kubernetes.io/projected/f2397075-b0df-4433-9997-f2fb1cb0db0e-kube-api-access-w4ln2\") pod \"f2397075-b0df-4433-9997-f2fb1cb0db0e\" (UID: \"f2397075-b0df-4433-9997-f2fb1cb0db0e\") "
Mar 10 16:06:05 crc kubenswrapper[4743]: I0310 16:06:05.803614 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2397075-b0df-4433-9997-f2fb1cb0db0e-kube-api-access-w4ln2" (OuterVolumeSpecName: "kube-api-access-w4ln2") pod "f2397075-b0df-4433-9997-f2fb1cb0db0e" (UID: "f2397075-b0df-4433-9997-f2fb1cb0db0e"). InnerVolumeSpecName "kube-api-access-w4ln2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:06:05 crc kubenswrapper[4743]: I0310 16:06:05.897926 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4ln2\" (UniqueName: \"kubernetes.io/projected/f2397075-b0df-4433-9997-f2fb1cb0db0e-kube-api-access-w4ln2\") on node \"crc\" DevicePath \"\""
Mar 10 16:06:06 crc kubenswrapper[4743]: I0310 16:06:06.089879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-5vsd4" event={"ID":"f2397075-b0df-4433-9997-f2fb1cb0db0e","Type":"ContainerDied","Data":"2b1d81ecf09b78a6462de8e2244c19d34754c1ad06718a596401997318924bab"}
Mar 10 16:06:06 crc kubenswrapper[4743]: I0310 16:06:06.089934 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1d81ecf09b78a6462de8e2244c19d34754c1ad06718a596401997318924bab"
Mar 10 16:06:06 crc kubenswrapper[4743]: I0310 16:06:06.089933 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-5vsd4"
Mar 10 16:06:06 crc kubenswrapper[4743]: I0310 16:06:06.785935 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-5ncfn"]
Mar 10 16:06:06 crc kubenswrapper[4743]: I0310 16:06:06.794054 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-5ncfn"]
Mar 10 16:06:07 crc kubenswrapper[4743]: I0310 16:06:07.934138 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e8fd61-7218-4529-88c6-3567ad2d2756" path="/var/lib/kubelet/pods/59e8fd61-7218-4529-88c6-3567ad2d2756/volumes"
Mar 10 16:06:11 crc kubenswrapper[4743]: I0310 16:06:11.252433 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 16:06:11 crc kubenswrapper[4743]: I0310 16:06:11.252782 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 16:06:15 crc kubenswrapper[4743]: I0310 16:06:15.643404 4743 scope.go:117] "RemoveContainer" containerID="ddf84644cad462c50115de2dad6f445c8c860663c3269ab2537303cf5020a383"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.425226 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-46bmt"]
Mar 10 16:06:25 crc kubenswrapper[4743]: E0310 16:06:25.427528 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2397075-b0df-4433-9997-f2fb1cb0db0e" containerName="oc"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.427600 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2397075-b0df-4433-9997-f2fb1cb0db0e" containerName="oc"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.430177 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2397075-b0df-4433-9997-f2fb1cb0db0e" containerName="oc"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.431745 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.439348 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-46bmt"]
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.594495 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-catalog-content\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.594672 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-utilities\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.594762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvm8j\" (UniqueName: \"kubernetes.io/projected/225c839b-109f-46cf-b4f9-902a74aba655-kube-api-access-qvm8j\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.696288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-utilities\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.696918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvm8j\" (UniqueName: \"kubernetes.io/projected/225c839b-109f-46cf-b4f9-902a74aba655-kube-api-access-qvm8j\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.696938 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-utilities\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.697281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-catalog-content\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.697792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-catalog-content\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.727596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvm8j\" (UniqueName: \"kubernetes.io/projected/225c839b-109f-46cf-b4f9-902a74aba655-kube-api-access-qvm8j\") pod \"redhat-marketplace-46bmt\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") " pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:25 crc kubenswrapper[4743]: I0310 16:06:25.750485 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:26 crc kubenswrapper[4743]: I0310 16:06:26.322795 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-46bmt"]
Mar 10 16:06:27 crc kubenswrapper[4743]: I0310 16:06:27.295541 4743 generic.go:334] "Generic (PLEG): container finished" podID="225c839b-109f-46cf-b4f9-902a74aba655" containerID="ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c" exitCode=0
Mar 10 16:06:27 crc kubenswrapper[4743]: I0310 16:06:27.295589 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46bmt" event={"ID":"225c839b-109f-46cf-b4f9-902a74aba655","Type":"ContainerDied","Data":"ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c"}
Mar 10 16:06:27 crc kubenswrapper[4743]: I0310 16:06:27.295833 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46bmt" event={"ID":"225c839b-109f-46cf-b4f9-902a74aba655","Type":"ContainerStarted","Data":"2d364ef71b3cad5f377ec12c0dee3d63899d90c5fea667fd505e36c24e7d9d11"}
Mar 10 16:06:28 crc kubenswrapper[4743]: I0310 16:06:28.305964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46bmt" event={"ID":"225c839b-109f-46cf-b4f9-902a74aba655","Type":"ContainerStarted","Data":"fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c"}
Mar 10 16:06:30 crc kubenswrapper[4743]: I0310 16:06:30.331668 4743 generic.go:334] "Generic (PLEG): container finished" podID="225c839b-109f-46cf-b4f9-902a74aba655" containerID="fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c" exitCode=0
Mar 10 16:06:30 crc kubenswrapper[4743]: I0310 16:06:30.332318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46bmt" event={"ID":"225c839b-109f-46cf-b4f9-902a74aba655","Type":"ContainerDied","Data":"fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c"}
Mar 10 16:06:31 crc kubenswrapper[4743]: I0310 16:06:31.345549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46bmt" event={"ID":"225c839b-109f-46cf-b4f9-902a74aba655","Type":"ContainerStarted","Data":"b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a"}
Mar 10 16:06:35 crc kubenswrapper[4743]: I0310 16:06:35.750744 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:35 crc kubenswrapper[4743]: I0310 16:06:35.751326 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:35 crc kubenswrapper[4743]: I0310 16:06:35.811919 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:35 crc kubenswrapper[4743]: I0310 16:06:35.835705 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-46bmt" podStartSLOduration=7.122460625 podStartE2EDuration="10.835685913s" podCreationTimestamp="2026-03-10 16:06:25 +0000 UTC" firstStartedPulling="2026-03-10 16:06:27.30103089 +0000 UTC m=+3652.007845628" lastFinishedPulling="2026-03-10 16:06:31.014256168 +0000 UTC m=+3655.721070916" observedRunningTime="2026-03-10 16:06:31.366373931 +0000 UTC m=+3656.073188679" watchObservedRunningTime="2026-03-10 16:06:35.835685913 +0000 UTC m=+3660.542500661"
Mar 10 16:06:36 crc kubenswrapper[4743]: I0310 16:06:36.472963 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:36 crc kubenswrapper[4743]: I0310 16:06:36.535007 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-46bmt"]
Mar 10 16:06:38 crc kubenswrapper[4743]: I0310 16:06:38.433741 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-46bmt" podUID="225c839b-109f-46cf-b4f9-902a74aba655" containerName="registry-server" containerID="cri-o://b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a" gracePeriod=2
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.089851 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.202201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvm8j\" (UniqueName: \"kubernetes.io/projected/225c839b-109f-46cf-b4f9-902a74aba655-kube-api-access-qvm8j\") pod \"225c839b-109f-46cf-b4f9-902a74aba655\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") "
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.202525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-utilities\") pod \"225c839b-109f-46cf-b4f9-902a74aba655\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") "
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.202594 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-catalog-content\") pod \"225c839b-109f-46cf-b4f9-902a74aba655\" (UID: \"225c839b-109f-46cf-b4f9-902a74aba655\") "
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.203474 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-utilities" (OuterVolumeSpecName: "utilities") pod "225c839b-109f-46cf-b4f9-902a74aba655" (UID: "225c839b-109f-46cf-b4f9-902a74aba655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.222122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225c839b-109f-46cf-b4f9-902a74aba655-kube-api-access-qvm8j" (OuterVolumeSpecName: "kube-api-access-qvm8j") pod "225c839b-109f-46cf-b4f9-902a74aba655" (UID: "225c839b-109f-46cf-b4f9-902a74aba655"). InnerVolumeSpecName "kube-api-access-qvm8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.240217 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "225c839b-109f-46cf-b4f9-902a74aba655" (UID: "225c839b-109f-46cf-b4f9-902a74aba655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.304977 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.305522 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225c839b-109f-46cf-b4f9-902a74aba655-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.305601 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvm8j\" (UniqueName: \"kubernetes.io/projected/225c839b-109f-46cf-b4f9-902a74aba655-kube-api-access-qvm8j\") on node \"crc\" DevicePath \"\""
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.446492 4743 generic.go:334] "Generic (PLEG): container finished" podID="225c839b-109f-46cf-b4f9-902a74aba655" containerID="b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a" exitCode=0
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.446531 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46bmt" event={"ID":"225c839b-109f-46cf-b4f9-902a74aba655","Type":"ContainerDied","Data":"b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a"}
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.446557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46bmt" event={"ID":"225c839b-109f-46cf-b4f9-902a74aba655","Type":"ContainerDied","Data":"2d364ef71b3cad5f377ec12c0dee3d63899d90c5fea667fd505e36c24e7d9d11"}
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.446575 4743 scope.go:117] "RemoveContainer" containerID="b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.446696 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46bmt"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.488026 4743 scope.go:117] "RemoveContainer" containerID="fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.496847 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-46bmt"]
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.524403 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-46bmt"]
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.545329 4743 scope.go:117] "RemoveContainer" containerID="ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.632918 4743 scope.go:117] "RemoveContainer" containerID="b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a"
Mar 10 16:06:39 crc kubenswrapper[4743]: E0310 16:06:39.636033 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a\": container with ID starting with b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a not found: ID does not exist" containerID="b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.636084 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a"} err="failed to get container status \"b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a\": rpc error: code = NotFound desc = could not find container \"b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a\": container with ID starting with b08ffa3cb17edd4678219549ef31aabc9cb399603cbe058469855af90d13fe5a not found: ID does not exist"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.636116 4743 scope.go:117] "RemoveContainer" containerID="fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c"
Mar 10 16:06:39 crc kubenswrapper[4743]: E0310 16:06:39.636626 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c\": container with ID starting with fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c not found: ID does not exist" containerID="fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.636668 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c"} err="failed to get container status \"fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c\": rpc error: code = NotFound desc = could not find container \"fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c\": container with ID starting with fa5dfd59fc5c8fbe7fdf1980dffadcdee134785c468b40a3452fde85edbcb67c not found: ID does not exist"
Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.636682 4743 scope.go:117] "RemoveContainer" containerID="ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c"
Mar 10 16:06:39 crc kubenswrapper[4743]: E0310 16:06:39.636991 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c\": container with ID starting with ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c not found: ID does not exist" containerID="ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c"
Mar 10 16:06:39 crc 
kubenswrapper[4743]: I0310 16:06:39.637016 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c"} err="failed to get container status \"ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c\": rpc error: code = NotFound desc = could not find container \"ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c\": container with ID starting with ce6848bde63edf8a9cd3eb9847453f8dd9088bc1fd5099936f9d5069314c832c not found: ID does not exist" Mar 10 16:06:39 crc kubenswrapper[4743]: I0310 16:06:39.931615 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225c839b-109f-46cf-b4f9-902a74aba655" path="/var/lib/kubelet/pods/225c839b-109f-46cf-b4f9-902a74aba655/volumes" Mar 10 16:06:41 crc kubenswrapper[4743]: I0310 16:06:41.252723 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:06:41 crc kubenswrapper[4743]: I0310 16:06:41.253330 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.252552 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.253314 4743 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.253384 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.254477 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.254574 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" gracePeriod=600 Mar 10 16:07:11 crc kubenswrapper[4743]: E0310 16:07:11.436669 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.731005 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" exitCode=0 Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.731050 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"} Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.731082 4743 scope.go:117] "RemoveContainer" containerID="405bc24361e3bca0128de2227b390cf21c2c732269df1c9883fdd1a30a8f8e7a" Mar 10 16:07:11 crc kubenswrapper[4743]: I0310 16:07:11.731860 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:07:11 crc kubenswrapper[4743]: E0310 16:07:11.732268 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:07:23 crc kubenswrapper[4743]: I0310 16:07:23.915916 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:07:23 crc kubenswrapper[4743]: E0310 16:07:23.916635 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 
16:07:37 crc kubenswrapper[4743]: I0310 16:07:37.918548 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:07:37 crc kubenswrapper[4743]: E0310 16:07:37.919316 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:07:51 crc kubenswrapper[4743]: I0310 16:07:51.915977 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:07:51 crc kubenswrapper[4743]: E0310 16:07:51.916748 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.151900 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552648-b2stk"] Mar 10 16:08:00 crc kubenswrapper[4743]: E0310 16:08:00.153079 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225c839b-109f-46cf-b4f9-902a74aba655" containerName="extract-content" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.153101 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="225c839b-109f-46cf-b4f9-902a74aba655" containerName="extract-content" Mar 10 16:08:00 crc kubenswrapper[4743]: E0310 16:08:00.153131 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="225c839b-109f-46cf-b4f9-902a74aba655" containerName="registry-server" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.153138 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="225c839b-109f-46cf-b4f9-902a74aba655" containerName="registry-server" Mar 10 16:08:00 crc kubenswrapper[4743]: E0310 16:08:00.153171 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225c839b-109f-46cf-b4f9-902a74aba655" containerName="extract-utilities" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.153180 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="225c839b-109f-46cf-b4f9-902a74aba655" containerName="extract-utilities" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.153422 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="225c839b-109f-46cf-b4f9-902a74aba655" containerName="registry-server" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.154346 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-b2stk" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.158176 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.158401 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.160712 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.164310 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-b2stk"] Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.269468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k286k\" (UniqueName: \"kubernetes.io/projected/12f4bd91-dabb-4e73-af48-3b6514b14b2d-kube-api-access-k286k\") pod \"auto-csr-approver-29552648-b2stk\" (UID: \"12f4bd91-dabb-4e73-af48-3b6514b14b2d\") " pod="openshift-infra/auto-csr-approver-29552648-b2stk" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.371422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k286k\" (UniqueName: \"kubernetes.io/projected/12f4bd91-dabb-4e73-af48-3b6514b14b2d-kube-api-access-k286k\") pod \"auto-csr-approver-29552648-b2stk\" (UID: \"12f4bd91-dabb-4e73-af48-3b6514b14b2d\") " pod="openshift-infra/auto-csr-approver-29552648-b2stk" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.396076 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k286k\" (UniqueName: \"kubernetes.io/projected/12f4bd91-dabb-4e73-af48-3b6514b14b2d-kube-api-access-k286k\") pod \"auto-csr-approver-29552648-b2stk\" (UID: \"12f4bd91-dabb-4e73-af48-3b6514b14b2d\") " 
pod="openshift-infra/auto-csr-approver-29552648-b2stk" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.479036 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-b2stk" Mar 10 16:08:00 crc kubenswrapper[4743]: I0310 16:08:00.958401 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-b2stk"] Mar 10 16:08:01 crc kubenswrapper[4743]: I0310 16:08:01.182760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-b2stk" event={"ID":"12f4bd91-dabb-4e73-af48-3b6514b14b2d","Type":"ContainerStarted","Data":"a42b5f4490e4aa53628dd241a5aa74549deadc3413267e228b97fd819444c664"} Mar 10 16:08:03 crc kubenswrapper[4743]: I0310 16:08:03.204660 4743 generic.go:334] "Generic (PLEG): container finished" podID="12f4bd91-dabb-4e73-af48-3b6514b14b2d" containerID="22b69878a7c8d41512f590c210f13c1ad0214a6cedc877671ccb27840d1666ae" exitCode=0 Mar 10 16:08:03 crc kubenswrapper[4743]: I0310 16:08:03.204758 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-b2stk" event={"ID":"12f4bd91-dabb-4e73-af48-3b6514b14b2d","Type":"ContainerDied","Data":"22b69878a7c8d41512f590c210f13c1ad0214a6cedc877671ccb27840d1666ae"} Mar 10 16:08:04 crc kubenswrapper[4743]: I0310 16:08:04.888218 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-b2stk" Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.083838 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k286k\" (UniqueName: \"kubernetes.io/projected/12f4bd91-dabb-4e73-af48-3b6514b14b2d-kube-api-access-k286k\") pod \"12f4bd91-dabb-4e73-af48-3b6514b14b2d\" (UID: \"12f4bd91-dabb-4e73-af48-3b6514b14b2d\") " Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.095907 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f4bd91-dabb-4e73-af48-3b6514b14b2d-kube-api-access-k286k" (OuterVolumeSpecName: "kube-api-access-k286k") pod "12f4bd91-dabb-4e73-af48-3b6514b14b2d" (UID: "12f4bd91-dabb-4e73-af48-3b6514b14b2d"). InnerVolumeSpecName "kube-api-access-k286k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.187143 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k286k\" (UniqueName: \"kubernetes.io/projected/12f4bd91-dabb-4e73-af48-3b6514b14b2d-kube-api-access-k286k\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.225661 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-b2stk" event={"ID":"12f4bd91-dabb-4e73-af48-3b6514b14b2d","Type":"ContainerDied","Data":"a42b5f4490e4aa53628dd241a5aa74549deadc3413267e228b97fd819444c664"} Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.225701 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42b5f4490e4aa53628dd241a5aa74549deadc3413267e228b97fd819444c664" Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.225722 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-b2stk" Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.923439 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:08:05 crc kubenswrapper[4743]: E0310 16:08:05.923954 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.974913 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-zcb2v"] Mar 10 16:08:05 crc kubenswrapper[4743]: I0310 16:08:05.984760 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-zcb2v"] Mar 10 16:08:07 crc kubenswrapper[4743]: I0310 16:08:07.931020 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eceb61d4-6c63-4509-8343-644538020013" path="/var/lib/kubelet/pods/eceb61d4-6c63-4509-8343-644538020013/volumes" Mar 10 16:08:15 crc kubenswrapper[4743]: I0310 16:08:15.819700 4743 scope.go:117] "RemoveContainer" containerID="34cd8a3e8a0b6bc0d9a2ca9885018e376b93bfe189c7619ac9b645d21ca70446" Mar 10 16:08:18 crc kubenswrapper[4743]: I0310 16:08:18.916802 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:08:18 crc kubenswrapper[4743]: E0310 16:08:18.917557 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:08:33 crc kubenswrapper[4743]: I0310 16:08:33.915892 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:08:33 crc kubenswrapper[4743]: E0310 16:08:33.917587 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:08:46 crc kubenswrapper[4743]: I0310 16:08:45.918069 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:08:46 crc kubenswrapper[4743]: E0310 16:08:45.921455 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:08:46 crc kubenswrapper[4743]: I0310 16:08:46.006269 4743 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 16:08:46 crc 
kubenswrapper[4743]: I0310 16:08:46.006428 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 16:09:00 crc kubenswrapper[4743]: I0310 16:09:00.915948 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:09:00 crc kubenswrapper[4743]: E0310 16:09:00.916861 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:09:14 crc kubenswrapper[4743]: I0310 16:09:14.914929 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:09:14 crc kubenswrapper[4743]: E0310 16:09:14.915694 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:09:28 crc kubenswrapper[4743]: I0310 16:09:28.915698 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:09:28 crc kubenswrapper[4743]: E0310 16:09:28.916650 
4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:09:39 crc kubenswrapper[4743]: I0310 16:09:39.916015 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:09:39 crc kubenswrapper[4743]: E0310 16:09:39.917133 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:09:54 crc kubenswrapper[4743]: I0310 16:09:54.916229 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:09:54 crc kubenswrapper[4743]: E0310 16:09:54.916860 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.159997 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552650-hdwfz"] Mar 10 16:10:00 crc kubenswrapper[4743]: E0310 
16:10:00.160767 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f4bd91-dabb-4e73-af48-3b6514b14b2d" containerName="oc" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.160779 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f4bd91-dabb-4e73-af48-3b6514b14b2d" containerName="oc" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.161008 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f4bd91-dabb-4e73-af48-3b6514b14b2d" containerName="oc" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.161682 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-hdwfz" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.163886 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.164922 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.164999 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.217641 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-hdwfz"] Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.279645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt2pw\" (UniqueName: \"kubernetes.io/projected/db1fe291-64d9-432e-9203-bdbdfd4e7a5f-kube-api-access-nt2pw\") pod \"auto-csr-approver-29552650-hdwfz\" (UID: \"db1fe291-64d9-432e-9203-bdbdfd4e7a5f\") " pod="openshift-infra/auto-csr-approver-29552650-hdwfz" Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.384101 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nt2pw\" (UniqueName: \"kubernetes.io/projected/db1fe291-64d9-432e-9203-bdbdfd4e7a5f-kube-api-access-nt2pw\") pod \"auto-csr-approver-29552650-hdwfz\" (UID: \"db1fe291-64d9-432e-9203-bdbdfd4e7a5f\") " pod="openshift-infra/auto-csr-approver-29552650-hdwfz"
Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.412538 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt2pw\" (UniqueName: \"kubernetes.io/projected/db1fe291-64d9-432e-9203-bdbdfd4e7a5f-kube-api-access-nt2pw\") pod \"auto-csr-approver-29552650-hdwfz\" (UID: \"db1fe291-64d9-432e-9203-bdbdfd4e7a5f\") " pod="openshift-infra/auto-csr-approver-29552650-hdwfz"
Mar 10 16:10:00 crc kubenswrapper[4743]: I0310 16:10:00.479869 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-hdwfz"
Mar 10 16:10:01 crc kubenswrapper[4743]: I0310 16:10:01.007724 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-hdwfz"]
Mar 10 16:10:01 crc kubenswrapper[4743]: I0310 16:10:01.132917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-hdwfz" event={"ID":"db1fe291-64d9-432e-9203-bdbdfd4e7a5f","Type":"ContainerStarted","Data":"9c61196901a5b096f6502ab6d0cca8b91b5640d7b13a00fd9cd125915d0c7c89"}
Mar 10 16:10:03 crc kubenswrapper[4743]: I0310 16:10:03.152231 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-hdwfz" event={"ID":"db1fe291-64d9-432e-9203-bdbdfd4e7a5f","Type":"ContainerStarted","Data":"9181d5716c48e3dc26fd7e66efacbafd11ac9d8522a8a5e22e4a422bf00f13d5"}
Mar 10 16:10:03 crc kubenswrapper[4743]: I0310 16:10:03.173227 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552650-hdwfz" podStartSLOduration=1.457917748 podStartE2EDuration="3.173208073s" podCreationTimestamp="2026-03-10 16:10:00 +0000 UTC" firstStartedPulling="2026-03-10 16:10:01.020395723 +0000 UTC m=+3865.727210471" lastFinishedPulling="2026-03-10 16:10:02.735686048 +0000 UTC m=+3867.442500796" observedRunningTime="2026-03-10 16:10:03.170204692 +0000 UTC m=+3867.877019450" watchObservedRunningTime="2026-03-10 16:10:03.173208073 +0000 UTC m=+3867.880022821"
Mar 10 16:10:04 crc kubenswrapper[4743]: I0310 16:10:04.162111 4743 generic.go:334] "Generic (PLEG): container finished" podID="db1fe291-64d9-432e-9203-bdbdfd4e7a5f" containerID="9181d5716c48e3dc26fd7e66efacbafd11ac9d8522a8a5e22e4a422bf00f13d5" exitCode=0
Mar 10 16:10:04 crc kubenswrapper[4743]: I0310 16:10:04.162164 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-hdwfz" event={"ID":"db1fe291-64d9-432e-9203-bdbdfd4e7a5f","Type":"ContainerDied","Data":"9181d5716c48e3dc26fd7e66efacbafd11ac9d8522a8a5e22e4a422bf00f13d5"}
Mar 10 16:10:05 crc kubenswrapper[4743]: I0310 16:10:05.802039 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-hdwfz"
Mar 10 16:10:05 crc kubenswrapper[4743]: I0310 16:10:05.896715 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt2pw\" (UniqueName: \"kubernetes.io/projected/db1fe291-64d9-432e-9203-bdbdfd4e7a5f-kube-api-access-nt2pw\") pod \"db1fe291-64d9-432e-9203-bdbdfd4e7a5f\" (UID: \"db1fe291-64d9-432e-9203-bdbdfd4e7a5f\") "
Mar 10 16:10:05 crc kubenswrapper[4743]: I0310 16:10:05.902557 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1fe291-64d9-432e-9203-bdbdfd4e7a5f-kube-api-access-nt2pw" (OuterVolumeSpecName: "kube-api-access-nt2pw") pod "db1fe291-64d9-432e-9203-bdbdfd4e7a5f" (UID: "db1fe291-64d9-432e-9203-bdbdfd4e7a5f"). InnerVolumeSpecName "kube-api-access-nt2pw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:10:06 crc kubenswrapper[4743]: I0310 16:10:06.003139 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt2pw\" (UniqueName: \"kubernetes.io/projected/db1fe291-64d9-432e-9203-bdbdfd4e7a5f-kube-api-access-nt2pw\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:06 crc kubenswrapper[4743]: I0310 16:10:06.182151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-hdwfz" event={"ID":"db1fe291-64d9-432e-9203-bdbdfd4e7a5f","Type":"ContainerDied","Data":"9c61196901a5b096f6502ab6d0cca8b91b5640d7b13a00fd9cd125915d0c7c89"}
Mar 10 16:10:06 crc kubenswrapper[4743]: I0310 16:10:06.182206 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c61196901a5b096f6502ab6d0cca8b91b5640d7b13a00fd9cd125915d0c7c89"
Mar 10 16:10:06 crc kubenswrapper[4743]: I0310 16:10:06.182274 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-hdwfz"
Mar 10 16:10:06 crc kubenswrapper[4743]: I0310 16:10:06.249130 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-pdw4b"]
Mar 10 16:10:06 crc kubenswrapper[4743]: I0310 16:10:06.256995 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-pdw4b"]
Mar 10 16:10:06 crc kubenswrapper[4743]: I0310 16:10:06.915313 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:10:06 crc kubenswrapper[4743]: E0310 16:10:06.916113 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:10:07 crc kubenswrapper[4743]: I0310 16:10:07.930919 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cceb81a2-a8b4-418b-9abd-31ea65f24354" path="/var/lib/kubelet/pods/cceb81a2-a8b4-418b-9abd-31ea65f24354/volumes"
Mar 10 16:10:15 crc kubenswrapper[4743]: I0310 16:10:15.926408 4743 scope.go:117] "RemoveContainer" containerID="54ed7b5e77de7ed5fe02e6bb9168d0116d1742b0b96abf7c7d0a632b32ee6fc2"
Mar 10 16:10:17 crc kubenswrapper[4743]: I0310 16:10:17.915697 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:10:17 crc kubenswrapper[4743]: E0310 16:10:17.916385 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:10:30 crc kubenswrapper[4743]: I0310 16:10:30.915076 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:10:30 crc kubenswrapper[4743]: E0310 16:10:30.915956 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:10:41 crc kubenswrapper[4743]: I0310 16:10:41.915349 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:10:41 crc kubenswrapper[4743]: E0310 16:10:41.916348 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:10:54 crc kubenswrapper[4743]: I0310 16:10:54.916063 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:10:54 crc kubenswrapper[4743]: E0310 16:10:54.918511 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:11:09 crc kubenswrapper[4743]: I0310 16:11:09.917067 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:11:09 crc kubenswrapper[4743]: E0310 16:11:09.917741 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:11:21 crc kubenswrapper[4743]: I0310 16:11:21.918347 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:11:21 crc kubenswrapper[4743]: E0310 16:11:21.919250 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:11:36 crc kubenswrapper[4743]: I0310 16:11:36.915735 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:11:36 crc kubenswrapper[4743]: E0310 16:11:36.917726 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:11:47 crc kubenswrapper[4743]: I0310 16:11:47.915676 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:11:47 crc kubenswrapper[4743]: E0310 16:11:47.916459 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:11:59 crc kubenswrapper[4743]: I0310 16:11:59.915839 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:11:59 crc kubenswrapper[4743]: E0310 16:11:59.916694 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.210062 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552652-zqdwx"]
Mar 10 16:12:00 crc kubenswrapper[4743]: E0310 16:12:00.210983 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1fe291-64d9-432e-9203-bdbdfd4e7a5f" containerName="oc"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.211007 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1fe291-64d9-432e-9203-bdbdfd4e7a5f" containerName="oc"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.211268 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1fe291-64d9-432e-9203-bdbdfd4e7a5f" containerName="oc"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.212097 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-zqdwx"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.215296 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.215560 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.216627 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.239426 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-zqdwx"]
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.288093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9nfr\" (UniqueName: \"kubernetes.io/projected/7b8ca985-8e46-4dc1-b71d-8d99b4f88b59-kube-api-access-b9nfr\") pod \"auto-csr-approver-29552652-zqdwx\" (UID: \"7b8ca985-8e46-4dc1-b71d-8d99b4f88b59\") " pod="openshift-infra/auto-csr-approver-29552652-zqdwx"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.393389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9nfr\" (UniqueName: \"kubernetes.io/projected/7b8ca985-8e46-4dc1-b71d-8d99b4f88b59-kube-api-access-b9nfr\") pod \"auto-csr-approver-29552652-zqdwx\" (UID: \"7b8ca985-8e46-4dc1-b71d-8d99b4f88b59\") " pod="openshift-infra/auto-csr-approver-29552652-zqdwx"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.436924 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9nfr\" (UniqueName: \"kubernetes.io/projected/7b8ca985-8e46-4dc1-b71d-8d99b4f88b59-kube-api-access-b9nfr\") pod \"auto-csr-approver-29552652-zqdwx\" (UID: \"7b8ca985-8e46-4dc1-b71d-8d99b4f88b59\") " pod="openshift-infra/auto-csr-approver-29552652-zqdwx"
Mar 10 16:12:00 crc kubenswrapper[4743]: I0310 16:12:00.539384 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-zqdwx"
Mar 10 16:12:01 crc kubenswrapper[4743]: I0310 16:12:01.360716 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-zqdwx"]
Mar 10 16:12:01 crc kubenswrapper[4743]: I0310 16:12:01.367734 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 16:12:02 crc kubenswrapper[4743]: I0310 16:12:02.300089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-zqdwx" event={"ID":"7b8ca985-8e46-4dc1-b71d-8d99b4f88b59","Type":"ContainerStarted","Data":"ad0219940ebc211036a2d6ff64d4d1f08886f6b26753e8fa5ea3896d74570f85"}
Mar 10 16:12:03 crc kubenswrapper[4743]: I0310 16:12:03.310563 4743 generic.go:334] "Generic (PLEG): container finished" podID="7b8ca985-8e46-4dc1-b71d-8d99b4f88b59" containerID="fb86ae6174b15273f6976480fd9b17f20528604c510527f878d6a7d89b52bb9b" exitCode=0
Mar 10 16:12:03 crc kubenswrapper[4743]: I0310 16:12:03.310844 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-zqdwx" event={"ID":"7b8ca985-8e46-4dc1-b71d-8d99b4f88b59","Type":"ContainerDied","Data":"fb86ae6174b15273f6976480fd9b17f20528604c510527f878d6a7d89b52bb9b"}
Mar 10 16:12:05 crc kubenswrapper[4743]: I0310 16:12:05.125648 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-zqdwx"
Mar 10 16:12:05 crc kubenswrapper[4743]: I0310 16:12:05.206483 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9nfr\" (UniqueName: \"kubernetes.io/projected/7b8ca985-8e46-4dc1-b71d-8d99b4f88b59-kube-api-access-b9nfr\") pod \"7b8ca985-8e46-4dc1-b71d-8d99b4f88b59\" (UID: \"7b8ca985-8e46-4dc1-b71d-8d99b4f88b59\") "
Mar 10 16:12:05 crc kubenswrapper[4743]: I0310 16:12:05.228985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8ca985-8e46-4dc1-b71d-8d99b4f88b59-kube-api-access-b9nfr" (OuterVolumeSpecName: "kube-api-access-b9nfr") pod "7b8ca985-8e46-4dc1-b71d-8d99b4f88b59" (UID: "7b8ca985-8e46-4dc1-b71d-8d99b4f88b59"). InnerVolumeSpecName "kube-api-access-b9nfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:12:05 crc kubenswrapper[4743]: I0310 16:12:05.308802 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9nfr\" (UniqueName: \"kubernetes.io/projected/7b8ca985-8e46-4dc1-b71d-8d99b4f88b59-kube-api-access-b9nfr\") on node \"crc\" DevicePath \"\""
Mar 10 16:12:05 crc kubenswrapper[4743]: I0310 16:12:05.329368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-zqdwx" event={"ID":"7b8ca985-8e46-4dc1-b71d-8d99b4f88b59","Type":"ContainerDied","Data":"ad0219940ebc211036a2d6ff64d4d1f08886f6b26753e8fa5ea3896d74570f85"}
Mar 10 16:12:05 crc kubenswrapper[4743]: I0310 16:12:05.329414 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad0219940ebc211036a2d6ff64d4d1f08886f6b26753e8fa5ea3896d74570f85"
Mar 10 16:12:05 crc kubenswrapper[4743]: I0310 16:12:05.329421 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-zqdwx"
Mar 10 16:12:06 crc kubenswrapper[4743]: I0310 16:12:06.241494 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-5vsd4"]
Mar 10 16:12:06 crc kubenswrapper[4743]: I0310 16:12:06.253279 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-5vsd4"]
Mar 10 16:12:07 crc kubenswrapper[4743]: I0310 16:12:07.929499 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2397075-b0df-4433-9997-f2fb1cb0db0e" path="/var/lib/kubelet/pods/f2397075-b0df-4433-9997-f2fb1cb0db0e/volumes"
Mar 10 16:12:12 crc kubenswrapper[4743]: I0310 16:12:12.916669 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04"
Mar 10 16:12:13 crc kubenswrapper[4743]: I0310 16:12:13.403182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"e35d3955c1caa5088792f1dddbbc2bda2687186cdaf535361f7d997435837e67"}
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.461182 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9m265"]
Mar 10 16:12:15 crc kubenswrapper[4743]: E0310 16:12:15.462056 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8ca985-8e46-4dc1-b71d-8d99b4f88b59" containerName="oc"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.462073 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8ca985-8e46-4dc1-b71d-8d99b4f88b59" containerName="oc"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.462280 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8ca985-8e46-4dc1-b71d-8d99b4f88b59" containerName="oc"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.463604 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.485403 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9m265"]
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.519449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xt9h\" (UniqueName: \"kubernetes.io/projected/ec3ced34-2909-472a-a506-400ff6f8b35d-kube-api-access-9xt9h\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.519491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-catalog-content\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.519523 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-utilities\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.621357 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xt9h\" (UniqueName: \"kubernetes.io/projected/ec3ced34-2909-472a-a506-400ff6f8b35d-kube-api-access-9xt9h\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.621407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-catalog-content\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.621434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-utilities\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.622016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-utilities\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.622079 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-catalog-content\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.642887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xt9h\" (UniqueName: \"kubernetes.io/projected/ec3ced34-2909-472a-a506-400ff6f8b35d-kube-api-access-9xt9h\") pod \"certified-operators-9m265\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") " pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:15 crc kubenswrapper[4743]: I0310 16:12:15.783012 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:16 crc kubenswrapper[4743]: I0310 16:12:16.036296 4743 scope.go:117] "RemoveContainer" containerID="3c34cdafaf2bec0bf4a77f7b3cd4289ea7cce33610e45169d34f7f16c5500c47"
Mar 10 16:12:16 crc kubenswrapper[4743]: I0310 16:12:16.518943 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9m265"]
Mar 10 16:12:17 crc kubenswrapper[4743]: I0310 16:12:17.445398 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerID="7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f" exitCode=0
Mar 10 16:12:17 crc kubenswrapper[4743]: I0310 16:12:17.445445 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m265" event={"ID":"ec3ced34-2909-472a-a506-400ff6f8b35d","Type":"ContainerDied","Data":"7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f"}
Mar 10 16:12:17 crc kubenswrapper[4743]: I0310 16:12:17.445918 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m265" event={"ID":"ec3ced34-2909-472a-a506-400ff6f8b35d","Type":"ContainerStarted","Data":"29e809ace10eee4d1661503e736cda253131d1c7fd9b626c14be20d43a92e8d7"}
Mar 10 16:12:18 crc kubenswrapper[4743]: I0310 16:12:18.456002 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m265" event={"ID":"ec3ced34-2909-472a-a506-400ff6f8b35d","Type":"ContainerStarted","Data":"167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9"}
Mar 10 16:12:20 crc kubenswrapper[4743]: I0310 16:12:20.476467 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerID="167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9" exitCode=0
Mar 10 16:12:20 crc kubenswrapper[4743]: I0310 16:12:20.476535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m265" event={"ID":"ec3ced34-2909-472a-a506-400ff6f8b35d","Type":"ContainerDied","Data":"167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9"}
Mar 10 16:12:21 crc kubenswrapper[4743]: I0310 16:12:21.491369 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m265" event={"ID":"ec3ced34-2909-472a-a506-400ff6f8b35d","Type":"ContainerStarted","Data":"2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40"}
Mar 10 16:12:21 crc kubenswrapper[4743]: I0310 16:12:21.515760 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9m265" podStartSLOduration=2.992049356 podStartE2EDuration="6.51573176s" podCreationTimestamp="2026-03-10 16:12:15 +0000 UTC" firstStartedPulling="2026-03-10 16:12:17.448205402 +0000 UTC m=+4002.155020150" lastFinishedPulling="2026-03-10 16:12:20.971887806 +0000 UTC m=+4005.678702554" observedRunningTime="2026-03-10 16:12:21.515701679 +0000 UTC m=+4006.222516437" watchObservedRunningTime="2026-03-10 16:12:21.51573176 +0000 UTC m=+4006.222546508"
Mar 10 16:12:25 crc kubenswrapper[4743]: I0310 16:12:25.783942 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:25 crc kubenswrapper[4743]: I0310 16:12:25.784496 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:26 crc kubenswrapper[4743]: I0310 16:12:26.833695 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9m265" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="registry-server" probeResult="failure" output=<
Mar 10 16:12:26 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s
Mar 10 16:12:26 crc kubenswrapper[4743]: >
Mar 10 16:12:35 crc kubenswrapper[4743]: I0310 16:12:35.888996 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:35 crc kubenswrapper[4743]: I0310 16:12:35.947123 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:36 crc kubenswrapper[4743]: I0310 16:12:36.124878 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9m265"]
Mar 10 16:12:37 crc kubenswrapper[4743]: I0310 16:12:37.628685 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9m265" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="registry-server" containerID="cri-o://2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40" gracePeriod=2
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.521319 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.628206 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-utilities\") pod \"ec3ced34-2909-472a-a506-400ff6f8b35d\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") "
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.628511 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-catalog-content\") pod \"ec3ced34-2909-472a-a506-400ff6f8b35d\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") "
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.628617 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xt9h\" (UniqueName: \"kubernetes.io/projected/ec3ced34-2909-472a-a506-400ff6f8b35d-kube-api-access-9xt9h\") pod \"ec3ced34-2909-472a-a506-400ff6f8b35d\" (UID: \"ec3ced34-2909-472a-a506-400ff6f8b35d\") "
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.629285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-utilities" (OuterVolumeSpecName: "utilities") pod "ec3ced34-2909-472a-a506-400ff6f8b35d" (UID: "ec3ced34-2909-472a-a506-400ff6f8b35d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.636128 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec3ced34-2909-472a-a506-400ff6f8b35d-kube-api-access-9xt9h" (OuterVolumeSpecName: "kube-api-access-9xt9h") pod "ec3ced34-2909-472a-a506-400ff6f8b35d" (UID: "ec3ced34-2909-472a-a506-400ff6f8b35d"). InnerVolumeSpecName "kube-api-access-9xt9h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.641272 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerID="2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40" exitCode=0
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.641318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m265" event={"ID":"ec3ced34-2909-472a-a506-400ff6f8b35d","Type":"ContainerDied","Data":"2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40"}
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.641344 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9m265"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.641369 4743 scope.go:117] "RemoveContainer" containerID="2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.641356 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m265" event={"ID":"ec3ced34-2909-472a-a506-400ff6f8b35d","Type":"ContainerDied","Data":"29e809ace10eee4d1661503e736cda253131d1c7fd9b626c14be20d43a92e8d7"}
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.695241 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec3ced34-2909-472a-a506-400ff6f8b35d" (UID: "ec3ced34-2909-472a-a506-400ff6f8b35d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.697044 4743 scope.go:117] "RemoveContainer" containerID="167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.717987 4743 scope.go:117] "RemoveContainer" containerID="7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.731098 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.731131 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec3ced34-2909-472a-a506-400ff6f8b35d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.731143 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xt9h\" (UniqueName: \"kubernetes.io/projected/ec3ced34-2909-472a-a506-400ff6f8b35d-kube-api-access-9xt9h\") on node \"crc\" DevicePath \"\""
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.770328 4743 scope.go:117] "RemoveContainer" containerID="2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40"
Mar 10 16:12:38 crc kubenswrapper[4743]: E0310 16:12:38.773650 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40\": container with ID starting with 2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40 not found: ID does not exist" containerID="2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.773693 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40"} err="failed to get container status \"2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40\": rpc error: code = NotFound desc = could not find container \"2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40\": container with ID starting with 2101a3048b458880a89e75f913201bfb2af19a3984d555a51a34aad2fdf0fb40 not found: ID does not exist"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.773719 4743 scope.go:117] "RemoveContainer" containerID="167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9"
Mar 10 16:12:38 crc kubenswrapper[4743]: E0310 16:12:38.773989 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9\": container with ID starting with 167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9 not found: ID does not exist" containerID="167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.774013 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9"} err="failed to get container status \"167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9\": rpc error: code = NotFound desc = could not find container \"167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9\": container with ID starting with 167800b78d43e4017e8336c60f6884787030fa2bfe6599ba291be4d71aaa46b9 not found: ID does not exist"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.774028 4743 scope.go:117] "RemoveContainer" containerID="7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f"
Mar 10 16:12:38 crc kubenswrapper[4743]: E0310 16:12:38.774579 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f\": container with ID starting with 7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f not found: ID does not exist" containerID="7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f"
Mar 10 16:12:38 crc kubenswrapper[4743]: I0310 16:12:38.774602 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f"} err="failed to get container status \"7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f\": rpc error: code = NotFound desc = could not find container \"7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f\": container with ID starting with 7ae67a21f88f4a511ec2396a2c8bac5bd0d0e2c9ccb0cea9810faab242439b4f not found: ID does not exist"
Mar 10 16:12:39 crc kubenswrapper[4743]: I0310 16:12:39.033454 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9m265"]
Mar 10 16:12:39 crc kubenswrapper[4743]: I0310 16:12:39.053379 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9m265"]
Mar 10 16:12:39 crc kubenswrapper[4743]: I0310 16:12:39.927291 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" path="/var/lib/kubelet/pods/ec3ced34-2909-472a-a506-400ff6f8b35d/volumes"
Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.148078 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552654-2sknh"]
Mar 10 16:14:00 crc kubenswrapper[4743]: E0310 16:14:00.149138 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="extract-content"
Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.149158 4743 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="extract-content" Mar 10 16:14:00 crc kubenswrapper[4743]: E0310 16:14:00.149176 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="registry-server" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.149183 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="registry-server" Mar 10 16:14:00 crc kubenswrapper[4743]: E0310 16:14:00.149195 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="extract-utilities" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.149201 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="extract-utilities" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.149383 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3ced34-2909-472a-a506-400ff6f8b35d" containerName="registry-server" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.150067 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552654-2sknh" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.153181 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.153205 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.153333 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.165524 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552654-2sknh"] Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.324927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z824c\" (UniqueName: \"kubernetes.io/projected/37ad2354-c9aa-47ba-aff5-f37e3d71fcc1-kube-api-access-z824c\") pod \"auto-csr-approver-29552654-2sknh\" (UID: \"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1\") " pod="openshift-infra/auto-csr-approver-29552654-2sknh" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.427243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z824c\" (UniqueName: \"kubernetes.io/projected/37ad2354-c9aa-47ba-aff5-f37e3d71fcc1-kube-api-access-z824c\") pod \"auto-csr-approver-29552654-2sknh\" (UID: \"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1\") " pod="openshift-infra/auto-csr-approver-29552654-2sknh" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.455858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z824c\" (UniqueName: \"kubernetes.io/projected/37ad2354-c9aa-47ba-aff5-f37e3d71fcc1-kube-api-access-z824c\") pod \"auto-csr-approver-29552654-2sknh\" (UID: \"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1\") " 
pod="openshift-infra/auto-csr-approver-29552654-2sknh" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.475488 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552654-2sknh" Mar 10 16:14:00 crc kubenswrapper[4743]: I0310 16:14:00.968782 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552654-2sknh"] Mar 10 16:14:01 crc kubenswrapper[4743]: I0310 16:14:01.413170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552654-2sknh" event={"ID":"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1","Type":"ContainerStarted","Data":"9e4b80ad6064e01fd2c1e4b09cb7437a7cbfc7de83aec2c3012ac92f1ae4241c"} Mar 10 16:14:02 crc kubenswrapper[4743]: I0310 16:14:02.424205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552654-2sknh" event={"ID":"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1","Type":"ContainerStarted","Data":"dc9165777f504abf0567ac54c6dcd53f3615cab276877ea3f26aa9295cb35176"} Mar 10 16:14:02 crc kubenswrapper[4743]: I0310 16:14:02.438312 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552654-2sknh" podStartSLOduration=1.642594889 podStartE2EDuration="2.438288273s" podCreationTimestamp="2026-03-10 16:14:00 +0000 UTC" firstStartedPulling="2026-03-10 16:14:00.980088355 +0000 UTC m=+4105.686903103" lastFinishedPulling="2026-03-10 16:14:01.775781739 +0000 UTC m=+4106.482596487" observedRunningTime="2026-03-10 16:14:02.434786164 +0000 UTC m=+4107.141600912" watchObservedRunningTime="2026-03-10 16:14:02.438288273 +0000 UTC m=+4107.145103021" Mar 10 16:14:03 crc kubenswrapper[4743]: I0310 16:14:03.441454 4743 generic.go:334] "Generic (PLEG): container finished" podID="37ad2354-c9aa-47ba-aff5-f37e3d71fcc1" containerID="dc9165777f504abf0567ac54c6dcd53f3615cab276877ea3f26aa9295cb35176" exitCode=0 Mar 10 16:14:03 crc 
kubenswrapper[4743]: I0310 16:14:03.441550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552654-2sknh" event={"ID":"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1","Type":"ContainerDied","Data":"dc9165777f504abf0567ac54c6dcd53f3615cab276877ea3f26aa9295cb35176"} Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.367729 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552654-2sknh" Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.477550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552654-2sknh" event={"ID":"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1","Type":"ContainerDied","Data":"9e4b80ad6064e01fd2c1e4b09cb7437a7cbfc7de83aec2c3012ac92f1ae4241c"} Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.477591 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e4b80ad6064e01fd2c1e4b09cb7437a7cbfc7de83aec2c3012ac92f1ae4241c" Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.477643 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552654-2sknh" Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.508972 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-b2stk"] Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.518190 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-b2stk"] Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.534892 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z824c\" (UniqueName: \"kubernetes.io/projected/37ad2354-c9aa-47ba-aff5-f37e3d71fcc1-kube-api-access-z824c\") pod \"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1\" (UID: \"37ad2354-c9aa-47ba-aff5-f37e3d71fcc1\") " Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.552734 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ad2354-c9aa-47ba-aff5-f37e3d71fcc1-kube-api-access-z824c" (OuterVolumeSpecName: "kube-api-access-z824c") pod "37ad2354-c9aa-47ba-aff5-f37e3d71fcc1" (UID: "37ad2354-c9aa-47ba-aff5-f37e3d71fcc1"). InnerVolumeSpecName "kube-api-access-z824c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.636880 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z824c\" (UniqueName: \"kubernetes.io/projected/37ad2354-c9aa-47ba-aff5-f37e3d71fcc1-kube-api-access-z824c\") on node \"crc\" DevicePath \"\"" Mar 10 16:14:05 crc kubenswrapper[4743]: I0310 16:14:05.925907 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f4bd91-dabb-4e73-af48-3b6514b14b2d" path="/var/lib/kubelet/pods/12f4bd91-dabb-4e73-af48-3b6514b14b2d/volumes" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.113499 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vjsv2"] Mar 10 16:14:07 crc kubenswrapper[4743]: E0310 16:14:07.114831 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ad2354-c9aa-47ba-aff5-f37e3d71fcc1" containerName="oc" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.114857 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ad2354-c9aa-47ba-aff5-f37e3d71fcc1" containerName="oc" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.115227 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ad2354-c9aa-47ba-aff5-f37e3d71fcc1" containerName="oc" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.117608 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.126552 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vjsv2"] Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.275564 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-catalog-content\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.276171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsrfc\" (UniqueName: \"kubernetes.io/projected/cf65e211-c17d-458a-aff7-26068fa9116b-kube-api-access-rsrfc\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.276329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-utilities\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.378345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsrfc\" (UniqueName: \"kubernetes.io/projected/cf65e211-c17d-458a-aff7-26068fa9116b-kube-api-access-rsrfc\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.378507 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-utilities\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.380803 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-utilities\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.383350 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-catalog-content\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.383791 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-catalog-content\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.404685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsrfc\" (UniqueName: \"kubernetes.io/projected/cf65e211-c17d-458a-aff7-26068fa9116b-kube-api-access-rsrfc\") pod \"redhat-operators-vjsv2\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.438289 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:07 crc kubenswrapper[4743]: I0310 16:14:07.987660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vjsv2"] Mar 10 16:14:08 crc kubenswrapper[4743]: I0310 16:14:08.512463 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf65e211-c17d-458a-aff7-26068fa9116b" containerID="158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42" exitCode=0 Mar 10 16:14:08 crc kubenswrapper[4743]: I0310 16:14:08.512717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjsv2" event={"ID":"cf65e211-c17d-458a-aff7-26068fa9116b","Type":"ContainerDied","Data":"158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42"} Mar 10 16:14:08 crc kubenswrapper[4743]: I0310 16:14:08.512741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjsv2" event={"ID":"cf65e211-c17d-458a-aff7-26068fa9116b","Type":"ContainerStarted","Data":"e82990d62ae92ded5dc74872a99e20b07122d8bf48cd95544434ca20b4197f5f"} Mar 10 16:14:10 crc kubenswrapper[4743]: I0310 16:14:10.534194 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjsv2" event={"ID":"cf65e211-c17d-458a-aff7-26068fa9116b","Type":"ContainerStarted","Data":"6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e"} Mar 10 16:14:13 crc kubenswrapper[4743]: I0310 16:14:13.566666 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf65e211-c17d-458a-aff7-26068fa9116b" containerID="6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e" exitCode=0 Mar 10 16:14:13 crc kubenswrapper[4743]: I0310 16:14:13.566746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjsv2" 
event={"ID":"cf65e211-c17d-458a-aff7-26068fa9116b","Type":"ContainerDied","Data":"6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e"} Mar 10 16:14:15 crc kubenswrapper[4743]: I0310 16:14:15.586395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjsv2" event={"ID":"cf65e211-c17d-458a-aff7-26068fa9116b","Type":"ContainerStarted","Data":"30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746"} Mar 10 16:14:15 crc kubenswrapper[4743]: I0310 16:14:15.603737 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vjsv2" podStartSLOduration=3.053034212 podStartE2EDuration="8.603717823s" podCreationTimestamp="2026-03-10 16:14:07 +0000 UTC" firstStartedPulling="2026-03-10 16:14:08.514580353 +0000 UTC m=+4113.221395101" lastFinishedPulling="2026-03-10 16:14:14.065263964 +0000 UTC m=+4118.772078712" observedRunningTime="2026-03-10 16:14:15.602783727 +0000 UTC m=+4120.309598475" watchObservedRunningTime="2026-03-10 16:14:15.603717823 +0000 UTC m=+4120.310532571" Mar 10 16:14:16 crc kubenswrapper[4743]: I0310 16:14:16.272536 4743 scope.go:117] "RemoveContainer" containerID="22b69878a7c8d41512f590c210f13c1ad0214a6cedc877671ccb27840d1666ae" Mar 10 16:14:17 crc kubenswrapper[4743]: I0310 16:14:17.438594 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:17 crc kubenswrapper[4743]: I0310 16:14:17.439935 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:18 crc kubenswrapper[4743]: I0310 16:14:18.488352 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vjsv2" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="registry-server" probeResult="failure" output=< Mar 10 16:14:18 crc kubenswrapper[4743]: timeout: failed to connect 
service ":50051" within 1s Mar 10 16:14:18 crc kubenswrapper[4743]: > Mar 10 16:14:27 crc kubenswrapper[4743]: I0310 16:14:27.494063 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:27 crc kubenswrapper[4743]: I0310 16:14:27.542539 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:27 crc kubenswrapper[4743]: I0310 16:14:27.740942 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vjsv2"] Mar 10 16:14:28 crc kubenswrapper[4743]: I0310 16:14:28.731280 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vjsv2" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="registry-server" containerID="cri-o://30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746" gracePeriod=2 Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.432673 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.561621 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsrfc\" (UniqueName: \"kubernetes.io/projected/cf65e211-c17d-458a-aff7-26068fa9116b-kube-api-access-rsrfc\") pod \"cf65e211-c17d-458a-aff7-26068fa9116b\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.561975 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-utilities\") pod \"cf65e211-c17d-458a-aff7-26068fa9116b\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.562133 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-catalog-content\") pod \"cf65e211-c17d-458a-aff7-26068fa9116b\" (UID: \"cf65e211-c17d-458a-aff7-26068fa9116b\") " Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.564573 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-utilities" (OuterVolumeSpecName: "utilities") pod "cf65e211-c17d-458a-aff7-26068fa9116b" (UID: "cf65e211-c17d-458a-aff7-26068fa9116b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.568800 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf65e211-c17d-458a-aff7-26068fa9116b-kube-api-access-rsrfc" (OuterVolumeSpecName: "kube-api-access-rsrfc") pod "cf65e211-c17d-458a-aff7-26068fa9116b" (UID: "cf65e211-c17d-458a-aff7-26068fa9116b"). InnerVolumeSpecName "kube-api-access-rsrfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.665719 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.665769 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsrfc\" (UniqueName: \"kubernetes.io/projected/cf65e211-c17d-458a-aff7-26068fa9116b-kube-api-access-rsrfc\") on node \"crc\" DevicePath \"\"" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.733424 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf65e211-c17d-458a-aff7-26068fa9116b" (UID: "cf65e211-c17d-458a-aff7-26068fa9116b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.741156 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf65e211-c17d-458a-aff7-26068fa9116b" containerID="30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746" exitCode=0 Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.741225 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjsv2" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.741207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjsv2" event={"ID":"cf65e211-c17d-458a-aff7-26068fa9116b","Type":"ContainerDied","Data":"30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746"} Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.741379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjsv2" event={"ID":"cf65e211-c17d-458a-aff7-26068fa9116b","Type":"ContainerDied","Data":"e82990d62ae92ded5dc74872a99e20b07122d8bf48cd95544434ca20b4197f5f"} Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.741406 4743 scope.go:117] "RemoveContainer" containerID="30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.766704 4743 scope.go:117] "RemoveContainer" containerID="6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.767695 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf65e211-c17d-458a-aff7-26068fa9116b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.777710 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vjsv2"] Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.788589 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vjsv2"] Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.793677 4743 scope.go:117] "RemoveContainer" containerID="158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.834784 4743 scope.go:117] "RemoveContainer" 
containerID="30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746" Mar 10 16:14:29 crc kubenswrapper[4743]: E0310 16:14:29.835427 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746\": container with ID starting with 30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746 not found: ID does not exist" containerID="30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.835518 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746"} err="failed to get container status \"30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746\": rpc error: code = NotFound desc = could not find container \"30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746\": container with ID starting with 30b6bae6ac2bf2292d6288e3b1c204e76651467c30d48cefbbb6f95e42770746 not found: ID does not exist" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.835554 4743 scope.go:117] "RemoveContainer" containerID="6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e" Mar 10 16:14:29 crc kubenswrapper[4743]: E0310 16:14:29.835907 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e\": container with ID starting with 6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e not found: ID does not exist" containerID="6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.835936 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e"} err="failed to get container status \"6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e\": rpc error: code = NotFound desc = could not find container \"6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e\": container with ID starting with 6e1c921af72bf18fa12f2a8886c830172deab73415db9d413eb891148f5bbf1e not found: ID does not exist" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.835958 4743 scope.go:117] "RemoveContainer" containerID="158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42" Mar 10 16:14:29 crc kubenswrapper[4743]: E0310 16:14:29.836213 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42\": container with ID starting with 158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42 not found: ID does not exist" containerID="158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.836254 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42"} err="failed to get container status \"158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42\": rpc error: code = NotFound desc = could not find container \"158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42\": container with ID starting with 158de9a1fd53b1a58189f02b240a2cd8370dfa760b9ea87bee0f46217b429b42 not found: ID does not exist" Mar 10 16:14:29 crc kubenswrapper[4743]: I0310 16:14:29.926757 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" path="/var/lib/kubelet/pods/cf65e211-c17d-458a-aff7-26068fa9116b/volumes" Mar 10 16:14:41 crc kubenswrapper[4743]: I0310 
16:14:41.253244 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:14:41 crc kubenswrapper[4743]: I0310 16:14:41.253882 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.150784 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8"] Mar 10 16:15:00 crc kubenswrapper[4743]: E0310 16:15:00.151999 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="extract-utilities" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.152019 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="extract-utilities" Mar 10 16:15:00 crc kubenswrapper[4743]: E0310 16:15:00.152041 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="registry-server" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.152049 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="registry-server" Mar 10 16:15:00 crc kubenswrapper[4743]: E0310 16:15:00.152080 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="extract-content" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.152088 4743 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="extract-content" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.152320 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf65e211-c17d-458a-aff7-26068fa9116b" containerName="registry-server" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.153342 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.156004 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.156045 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.171219 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8"] Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.289391 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc9847d-7547-492a-af94-694e20c44775-config-volume\") pod \"collect-profiles-29552655-jntz8\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.289436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6zw5\" (UniqueName: \"kubernetes.io/projected/1bc9847d-7547-492a-af94-694e20c44775-kube-api-access-s6zw5\") pod \"collect-profiles-29552655-jntz8\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" 
Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.289478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc9847d-7547-492a-af94-694e20c44775-secret-volume\") pod \"collect-profiles-29552655-jntz8\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.391460 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc9847d-7547-492a-af94-694e20c44775-secret-volume\") pod \"collect-profiles-29552655-jntz8\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.391649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc9847d-7547-492a-af94-694e20c44775-config-volume\") pod \"collect-profiles-29552655-jntz8\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.391673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6zw5\" (UniqueName: \"kubernetes.io/projected/1bc9847d-7547-492a-af94-694e20c44775-kube-api-access-s6zw5\") pod \"collect-profiles-29552655-jntz8\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.392843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc9847d-7547-492a-af94-694e20c44775-config-volume\") pod \"collect-profiles-29552655-jntz8\" 
(UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.410160 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6zw5\" (UniqueName: \"kubernetes.io/projected/1bc9847d-7547-492a-af94-694e20c44775-kube-api-access-s6zw5\") pod \"collect-profiles-29552655-jntz8\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.412122 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc9847d-7547-492a-af94-694e20c44775-secret-volume\") pod \"collect-profiles-29552655-jntz8\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.475333 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:00 crc kubenswrapper[4743]: I0310 16:15:00.934207 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8"] Mar 10 16:15:01 crc kubenswrapper[4743]: I0310 16:15:01.016951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" event={"ID":"1bc9847d-7547-492a-af94-694e20c44775","Type":"ContainerStarted","Data":"12a7788a7a317448f0bbbf39de93a43105cd1a030197030b73f3dc75739a4031"} Mar 10 16:15:02 crc kubenswrapper[4743]: I0310 16:15:02.031078 4743 generic.go:334] "Generic (PLEG): container finished" podID="1bc9847d-7547-492a-af94-694e20c44775" containerID="6cebc824cc73dd8d95a5de0f9cf93384fec230f577ebb3c16eb2f5c443da83d4" exitCode=0 Mar 10 16:15:02 crc kubenswrapper[4743]: I0310 16:15:02.031135 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" event={"ID":"1bc9847d-7547-492a-af94-694e20c44775","Type":"ContainerDied","Data":"6cebc824cc73dd8d95a5de0f9cf93384fec230f577ebb3c16eb2f5c443da83d4"} Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.670667 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.779517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc9847d-7547-492a-af94-694e20c44775-secret-volume\") pod \"1bc9847d-7547-492a-af94-694e20c44775\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.779605 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6zw5\" (UniqueName: \"kubernetes.io/projected/1bc9847d-7547-492a-af94-694e20c44775-kube-api-access-s6zw5\") pod \"1bc9847d-7547-492a-af94-694e20c44775\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.779658 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc9847d-7547-492a-af94-694e20c44775-config-volume\") pod \"1bc9847d-7547-492a-af94-694e20c44775\" (UID: \"1bc9847d-7547-492a-af94-694e20c44775\") " Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.780741 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc9847d-7547-492a-af94-694e20c44775-config-volume" (OuterVolumeSpecName: "config-volume") pod "1bc9847d-7547-492a-af94-694e20c44775" (UID: "1bc9847d-7547-492a-af94-694e20c44775"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.786027 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc9847d-7547-492a-af94-694e20c44775-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1bc9847d-7547-492a-af94-694e20c44775" (UID: "1bc9847d-7547-492a-af94-694e20c44775"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.788074 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc9847d-7547-492a-af94-694e20c44775-kube-api-access-s6zw5" (OuterVolumeSpecName: "kube-api-access-s6zw5") pod "1bc9847d-7547-492a-af94-694e20c44775" (UID: "1bc9847d-7547-492a-af94-694e20c44775"). InnerVolumeSpecName "kube-api-access-s6zw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.881852 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6zw5\" (UniqueName: \"kubernetes.io/projected/1bc9847d-7547-492a-af94-694e20c44775-kube-api-access-s6zw5\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.882145 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bc9847d-7547-492a-af94-694e20c44775-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:03 crc kubenswrapper[4743]: I0310 16:15:03.882230 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bc9847d-7547-492a-af94-694e20c44775-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:04 crc kubenswrapper[4743]: I0310 16:15:04.052424 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" Mar 10 16:15:04 crc kubenswrapper[4743]: I0310 16:15:04.052466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-jntz8" event={"ID":"1bc9847d-7547-492a-af94-694e20c44775","Type":"ContainerDied","Data":"12a7788a7a317448f0bbbf39de93a43105cd1a030197030b73f3dc75739a4031"} Mar 10 16:15:04 crc kubenswrapper[4743]: I0310 16:15:04.053181 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12a7788a7a317448f0bbbf39de93a43105cd1a030197030b73f3dc75739a4031" Mar 10 16:15:04 crc kubenswrapper[4743]: I0310 16:15:04.750861 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v"] Mar 10 16:15:04 crc kubenswrapper[4743]: I0310 16:15:04.762565 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552610-7885v"] Mar 10 16:15:05 crc kubenswrapper[4743]: I0310 16:15:05.927926 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677b6149-3bf9-45ee-938e-783742deb6dd" path="/var/lib/kubelet/pods/677b6149-3bf9-45ee-938e-783742deb6dd/volumes" Mar 10 16:15:11 crc kubenswrapper[4743]: I0310 16:15:11.252936 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:15:11 crc kubenswrapper[4743]: I0310 16:15:11.253628 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 10 16:15:16 crc kubenswrapper[4743]: I0310 16:15:16.719649 4743 scope.go:117] "RemoveContainer" containerID="9188259f4dcb6189e963b475c9e9b42e1ee5208a091ed135bbfbe4b78460f2ac" Mar 10 16:15:41 crc kubenswrapper[4743]: I0310 16:15:41.252896 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:15:41 crc kubenswrapper[4743]: I0310 16:15:41.253678 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:15:41 crc kubenswrapper[4743]: I0310 16:15:41.253761 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 16:15:41 crc kubenswrapper[4743]: I0310 16:15:41.255140 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e35d3955c1caa5088792f1dddbbc2bda2687186cdaf535361f7d997435837e67"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:15:41 crc kubenswrapper[4743]: I0310 16:15:41.255278 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://e35d3955c1caa5088792f1dddbbc2bda2687186cdaf535361f7d997435837e67" 
gracePeriod=600 Mar 10 16:15:41 crc kubenswrapper[4743]: I0310 16:15:41.413718 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="e35d3955c1caa5088792f1dddbbc2bda2687186cdaf535361f7d997435837e67" exitCode=0 Mar 10 16:15:41 crc kubenswrapper[4743]: I0310 16:15:41.413788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"e35d3955c1caa5088792f1dddbbc2bda2687186cdaf535361f7d997435837e67"} Mar 10 16:15:41 crc kubenswrapper[4743]: I0310 16:15:41.413861 4743 scope.go:117] "RemoveContainer" containerID="d972bc1b490f15bc7cf41d0074c12ac24eea634715d622ba4ed4c4fa85f2ac04" Mar 10 16:15:42 crc kubenswrapper[4743]: I0310 16:15:42.439289 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862"} Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.152373 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552656-gfgnw"] Mar 10 16:16:00 crc kubenswrapper[4743]: E0310 16:16:00.153319 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc9847d-7547-492a-af94-694e20c44775" containerName="collect-profiles" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.153335 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc9847d-7547-492a-af94-694e20c44775" containerName="collect-profiles" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.153516 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc9847d-7547-492a-af94-694e20c44775" containerName="collect-profiles" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.154310 4743 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552656-gfgnw" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.156516 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.158721 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.158960 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.175638 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552656-gfgnw"] Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.350131 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vmvq\" (UniqueName: \"kubernetes.io/projected/f6d2fdff-e1d9-4b41-b6c2-42489002e915-kube-api-access-9vmvq\") pod \"auto-csr-approver-29552656-gfgnw\" (UID: \"f6d2fdff-e1d9-4b41-b6c2-42489002e915\") " pod="openshift-infra/auto-csr-approver-29552656-gfgnw" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.452322 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vmvq\" (UniqueName: \"kubernetes.io/projected/f6d2fdff-e1d9-4b41-b6c2-42489002e915-kube-api-access-9vmvq\") pod \"auto-csr-approver-29552656-gfgnw\" (UID: \"f6d2fdff-e1d9-4b41-b6c2-42489002e915\") " pod="openshift-infra/auto-csr-approver-29552656-gfgnw" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.481280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vmvq\" (UniqueName: \"kubernetes.io/projected/f6d2fdff-e1d9-4b41-b6c2-42489002e915-kube-api-access-9vmvq\") pod \"auto-csr-approver-29552656-gfgnw\" (UID: 
\"f6d2fdff-e1d9-4b41-b6c2-42489002e915\") " pod="openshift-infra/auto-csr-approver-29552656-gfgnw" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.485096 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552656-gfgnw" Mar 10 16:16:00 crc kubenswrapper[4743]: I0310 16:16:00.986394 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552656-gfgnw"] Mar 10 16:16:01 crc kubenswrapper[4743]: W0310 16:16:01.366434 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d2fdff_e1d9_4b41_b6c2_42489002e915.slice/crio-6f3de73051e2b603e383ffe58cf1d470809e83aeca31a4ee5fc7174296da8aba WatchSource:0}: Error finding container 6f3de73051e2b603e383ffe58cf1d470809e83aeca31a4ee5fc7174296da8aba: Status 404 returned error can't find the container with id 6f3de73051e2b603e383ffe58cf1d470809e83aeca31a4ee5fc7174296da8aba Mar 10 16:16:01 crc kubenswrapper[4743]: I0310 16:16:01.647960 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552656-gfgnw" event={"ID":"f6d2fdff-e1d9-4b41-b6c2-42489002e915","Type":"ContainerStarted","Data":"6f3de73051e2b603e383ffe58cf1d470809e83aeca31a4ee5fc7174296da8aba"} Mar 10 16:16:03 crc kubenswrapper[4743]: I0310 16:16:03.666874 4743 generic.go:334] "Generic (PLEG): container finished" podID="f6d2fdff-e1d9-4b41-b6c2-42489002e915" containerID="bf0a9f67ba185ecdeac1ad334a8e37ff2322ba9cdcf76e6c6ad59fb4a85c5de8" exitCode=0 Mar 10 16:16:03 crc kubenswrapper[4743]: I0310 16:16:03.666975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552656-gfgnw" event={"ID":"f6d2fdff-e1d9-4b41-b6c2-42489002e915","Type":"ContainerDied","Data":"bf0a9f67ba185ecdeac1ad334a8e37ff2322ba9cdcf76e6c6ad59fb4a85c5de8"} Mar 10 16:16:05 crc kubenswrapper[4743]: I0310 16:16:05.193704 4743 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552656-gfgnw" Mar 10 16:16:05 crc kubenswrapper[4743]: I0310 16:16:05.347471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vmvq\" (UniqueName: \"kubernetes.io/projected/f6d2fdff-e1d9-4b41-b6c2-42489002e915-kube-api-access-9vmvq\") pod \"f6d2fdff-e1d9-4b41-b6c2-42489002e915\" (UID: \"f6d2fdff-e1d9-4b41-b6c2-42489002e915\") " Mar 10 16:16:05 crc kubenswrapper[4743]: I0310 16:16:05.358999 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d2fdff-e1d9-4b41-b6c2-42489002e915-kube-api-access-9vmvq" (OuterVolumeSpecName: "kube-api-access-9vmvq") pod "f6d2fdff-e1d9-4b41-b6c2-42489002e915" (UID: "f6d2fdff-e1d9-4b41-b6c2-42489002e915"). InnerVolumeSpecName "kube-api-access-9vmvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:16:05 crc kubenswrapper[4743]: I0310 16:16:05.450152 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vmvq\" (UniqueName: \"kubernetes.io/projected/f6d2fdff-e1d9-4b41-b6c2-42489002e915-kube-api-access-9vmvq\") on node \"crc\" DevicePath \"\"" Mar 10 16:16:05 crc kubenswrapper[4743]: I0310 16:16:05.685913 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552656-gfgnw" event={"ID":"f6d2fdff-e1d9-4b41-b6c2-42489002e915","Type":"ContainerDied","Data":"6f3de73051e2b603e383ffe58cf1d470809e83aeca31a4ee5fc7174296da8aba"} Mar 10 16:16:05 crc kubenswrapper[4743]: I0310 16:16:05.685952 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3de73051e2b603e383ffe58cf1d470809e83aeca31a4ee5fc7174296da8aba" Mar 10 16:16:05 crc kubenswrapper[4743]: I0310 16:16:05.686003 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552656-gfgnw" Mar 10 16:16:06 crc kubenswrapper[4743]: I0310 16:16:06.275730 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-hdwfz"] Mar 10 16:16:06 crc kubenswrapper[4743]: I0310 16:16:06.285786 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-hdwfz"] Mar 10 16:16:07 crc kubenswrapper[4743]: I0310 16:16:07.929756 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1fe291-64d9-432e-9203-bdbdfd4e7a5f" path="/var/lib/kubelet/pods/db1fe291-64d9-432e-9203-bdbdfd4e7a5f/volumes" Mar 10 16:16:16 crc kubenswrapper[4743]: I0310 16:16:16.826002 4743 scope.go:117] "RemoveContainer" containerID="9181d5716c48e3dc26fd7e66efacbafd11ac9d8522a8a5e22e4a422bf00f13d5" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.502229 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dv2ql"] Mar 10 16:17:38 crc kubenswrapper[4743]: E0310 16:17:38.503394 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d2fdff-e1d9-4b41-b6c2-42489002e915" containerName="oc" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.503409 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d2fdff-e1d9-4b41-b6c2-42489002e915" containerName="oc" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.503596 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d2fdff-e1d9-4b41-b6c2-42489002e915" containerName="oc" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.505339 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.523722 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dv2ql"] Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.597628 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-utilities\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.597867 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-catalog-content\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.597976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78s6b\" (UniqueName: \"kubernetes.io/projected/cc79320b-71ad-4d8a-b54f-f1fa508ec934-kube-api-access-78s6b\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.700008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-utilities\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.700408 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-catalog-content\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.700524 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78s6b\" (UniqueName: \"kubernetes.io/projected/cc79320b-71ad-4d8a-b54f-f1fa508ec934-kube-api-access-78s6b\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.700714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-utilities\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.700865 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-catalog-content\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.735979 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78s6b\" (UniqueName: \"kubernetes.io/projected/cc79320b-71ad-4d8a-b54f-f1fa508ec934-kube-api-access-78s6b\") pod \"community-operators-dv2ql\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:38 crc kubenswrapper[4743]: I0310 16:17:38.826070 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:39 crc kubenswrapper[4743]: I0310 16:17:39.394047 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dv2ql"] Mar 10 16:17:39 crc kubenswrapper[4743]: W0310 16:17:39.473651 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc79320b_71ad_4d8a_b54f_f1fa508ec934.slice/crio-9e0ddf5c3afac555306e3387688c85c1b572f06e770e3f8e991ff5969b387fa8 WatchSource:0}: Error finding container 9e0ddf5c3afac555306e3387688c85c1b572f06e770e3f8e991ff5969b387fa8: Status 404 returned error can't find the container with id 9e0ddf5c3afac555306e3387688c85c1b572f06e770e3f8e991ff5969b387fa8 Mar 10 16:17:39 crc kubenswrapper[4743]: I0310 16:17:39.560409 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2ql" event={"ID":"cc79320b-71ad-4d8a-b54f-f1fa508ec934","Type":"ContainerStarted","Data":"9e0ddf5c3afac555306e3387688c85c1b572f06e770e3f8e991ff5969b387fa8"} Mar 10 16:17:40 crc kubenswrapper[4743]: I0310 16:17:40.575859 4743 generic.go:334] "Generic (PLEG): container finished" podID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerID="09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff" exitCode=0 Mar 10 16:17:40 crc kubenswrapper[4743]: I0310 16:17:40.576053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2ql" event={"ID":"cc79320b-71ad-4d8a-b54f-f1fa508ec934","Type":"ContainerDied","Data":"09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff"} Mar 10 16:17:40 crc kubenswrapper[4743]: I0310 16:17:40.580075 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.100554 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4t688"] Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.102830 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.124389 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t688"] Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.160246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-catalog-content\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.160308 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-utilities\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.160748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cchm\" (UniqueName: \"kubernetes.io/projected/566926cf-f903-433a-8693-0ee4c8bf62c5-kube-api-access-5cchm\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.252363 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.252424 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.262441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-catalog-content\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.262514 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-utilities\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.262744 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cchm\" (UniqueName: \"kubernetes.io/projected/566926cf-f903-433a-8693-0ee4c8bf62c5-kube-api-access-5cchm\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.263056 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-utilities\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc 
kubenswrapper[4743]: I0310 16:17:41.263053 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-catalog-content\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.291339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cchm\" (UniqueName: \"kubernetes.io/projected/566926cf-f903-433a-8693-0ee4c8bf62c5-kube-api-access-5cchm\") pod \"redhat-marketplace-4t688\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: I0310 16:17:41.421361 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:41 crc kubenswrapper[4743]: W0310 16:17:41.993490 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod566926cf_f903_433a_8693_0ee4c8bf62c5.slice/crio-f7b6707d3e8c35a45a0c2ae2c14926584cf5ad4767d88ea5c09e05891d5650fa WatchSource:0}: Error finding container f7b6707d3e8c35a45a0c2ae2c14926584cf5ad4767d88ea5c09e05891d5650fa: Status 404 returned error can't find the container with id f7b6707d3e8c35a45a0c2ae2c14926584cf5ad4767d88ea5c09e05891d5650fa Mar 10 16:17:42 crc kubenswrapper[4743]: I0310 16:17:42.002587 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t688"] Mar 10 16:17:42 crc kubenswrapper[4743]: I0310 16:17:42.599074 4743 generic.go:334] "Generic (PLEG): container finished" podID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerID="8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a" exitCode=0 Mar 10 16:17:42 crc kubenswrapper[4743]: I0310 16:17:42.599232 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t688" event={"ID":"566926cf-f903-433a-8693-0ee4c8bf62c5","Type":"ContainerDied","Data":"8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a"} Mar 10 16:17:42 crc kubenswrapper[4743]: I0310 16:17:42.599855 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t688" event={"ID":"566926cf-f903-433a-8693-0ee4c8bf62c5","Type":"ContainerStarted","Data":"f7b6707d3e8c35a45a0c2ae2c14926584cf5ad4767d88ea5c09e05891d5650fa"} Mar 10 16:17:42 crc kubenswrapper[4743]: I0310 16:17:42.605561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2ql" event={"ID":"cc79320b-71ad-4d8a-b54f-f1fa508ec934","Type":"ContainerStarted","Data":"c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2"} Mar 10 16:17:43 crc kubenswrapper[4743]: I0310 16:17:43.615011 4743 generic.go:334] "Generic (PLEG): container finished" podID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerID="c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2" exitCode=0 Mar 10 16:17:43 crc kubenswrapper[4743]: I0310 16:17:43.615236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2ql" event={"ID":"cc79320b-71ad-4d8a-b54f-f1fa508ec934","Type":"ContainerDied","Data":"c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2"} Mar 10 16:17:44 crc kubenswrapper[4743]: I0310 16:17:44.633457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t688" event={"ID":"566926cf-f903-433a-8693-0ee4c8bf62c5","Type":"ContainerStarted","Data":"4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb"} Mar 10 16:17:45 crc kubenswrapper[4743]: I0310 16:17:45.648547 4743 generic.go:334] "Generic (PLEG): container finished" podID="566926cf-f903-433a-8693-0ee4c8bf62c5" 
containerID="4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb" exitCode=0 Mar 10 16:17:45 crc kubenswrapper[4743]: I0310 16:17:45.648745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t688" event={"ID":"566926cf-f903-433a-8693-0ee4c8bf62c5","Type":"ContainerDied","Data":"4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb"} Mar 10 16:17:45 crc kubenswrapper[4743]: I0310 16:17:45.651653 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2ql" event={"ID":"cc79320b-71ad-4d8a-b54f-f1fa508ec934","Type":"ContainerStarted","Data":"89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95"} Mar 10 16:17:45 crc kubenswrapper[4743]: I0310 16:17:45.702226 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dv2ql" podStartSLOduration=3.7120292729999997 podStartE2EDuration="7.702198901s" podCreationTimestamp="2026-03-10 16:17:38 +0000 UTC" firstStartedPulling="2026-03-10 16:17:40.579750872 +0000 UTC m=+4325.286565620" lastFinishedPulling="2026-03-10 16:17:44.5699205 +0000 UTC m=+4329.276735248" observedRunningTime="2026-03-10 16:17:45.686036223 +0000 UTC m=+4330.392850971" watchObservedRunningTime="2026-03-10 16:17:45.702198901 +0000 UTC m=+4330.409013689" Mar 10 16:17:46 crc kubenswrapper[4743]: I0310 16:17:46.664422 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t688" event={"ID":"566926cf-f903-433a-8693-0ee4c8bf62c5","Type":"ContainerStarted","Data":"830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56"} Mar 10 16:17:46 crc kubenswrapper[4743]: I0310 16:17:46.687520 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4t688" podStartSLOduration=2.243407121 podStartE2EDuration="5.687503772s" podCreationTimestamp="2026-03-10 16:17:41 
+0000 UTC" firstStartedPulling="2026-03-10 16:17:42.600756189 +0000 UTC m=+4327.307570937" lastFinishedPulling="2026-03-10 16:17:46.04485284 +0000 UTC m=+4330.751667588" observedRunningTime="2026-03-10 16:17:46.681739809 +0000 UTC m=+4331.388554557" watchObservedRunningTime="2026-03-10 16:17:46.687503772 +0000 UTC m=+4331.394318520" Mar 10 16:17:48 crc kubenswrapper[4743]: I0310 16:17:48.827226 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:48 crc kubenswrapper[4743]: I0310 16:17:48.828928 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:49 crc kubenswrapper[4743]: I0310 16:17:49.873617 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dv2ql" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="registry-server" probeResult="failure" output=< Mar 10 16:17:49 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Mar 10 16:17:49 crc kubenswrapper[4743]: > Mar 10 16:17:51 crc kubenswrapper[4743]: I0310 16:17:51.421502 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:51 crc kubenswrapper[4743]: I0310 16:17:51.421872 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:51 crc kubenswrapper[4743]: I0310 16:17:51.482248 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:51 crc kubenswrapper[4743]: I0310 16:17:51.807472 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:51 crc kubenswrapper[4743]: I0310 16:17:51.890915 4743 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-4t688"] Mar 10 16:17:53 crc kubenswrapper[4743]: I0310 16:17:53.721013 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4t688" podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerName="registry-server" containerID="cri-o://830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56" gracePeriod=2 Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.346490 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.459177 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-utilities\") pod \"566926cf-f903-433a-8693-0ee4c8bf62c5\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.459381 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cchm\" (UniqueName: \"kubernetes.io/projected/566926cf-f903-433a-8693-0ee4c8bf62c5-kube-api-access-5cchm\") pod \"566926cf-f903-433a-8693-0ee4c8bf62c5\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.459888 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-utilities" (OuterVolumeSpecName: "utilities") pod "566926cf-f903-433a-8693-0ee4c8bf62c5" (UID: "566926cf-f903-433a-8693-0ee4c8bf62c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.460686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-catalog-content\") pod \"566926cf-f903-433a-8693-0ee4c8bf62c5\" (UID: \"566926cf-f903-433a-8693-0ee4c8bf62c5\") " Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.461374 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.467100 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/566926cf-f903-433a-8693-0ee4c8bf62c5-kube-api-access-5cchm" (OuterVolumeSpecName: "kube-api-access-5cchm") pod "566926cf-f903-433a-8693-0ee4c8bf62c5" (UID: "566926cf-f903-433a-8693-0ee4c8bf62c5"). InnerVolumeSpecName "kube-api-access-5cchm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.489875 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "566926cf-f903-433a-8693-0ee4c8bf62c5" (UID: "566926cf-f903-433a-8693-0ee4c8bf62c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.563080 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/566926cf-f903-433a-8693-0ee4c8bf62c5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.563124 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cchm\" (UniqueName: \"kubernetes.io/projected/566926cf-f903-433a-8693-0ee4c8bf62c5-kube-api-access-5cchm\") on node \"crc\" DevicePath \"\"" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.731422 4743 generic.go:334] "Generic (PLEG): container finished" podID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerID="830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56" exitCode=0 Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.731464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t688" event={"ID":"566926cf-f903-433a-8693-0ee4c8bf62c5","Type":"ContainerDied","Data":"830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56"} Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.731490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t688" event={"ID":"566926cf-f903-433a-8693-0ee4c8bf62c5","Type":"ContainerDied","Data":"f7b6707d3e8c35a45a0c2ae2c14926584cf5ad4767d88ea5c09e05891d5650fa"} Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.731507 4743 scope.go:117] "RemoveContainer" containerID="830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.731637 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t688" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.768913 4743 scope.go:117] "RemoveContainer" containerID="4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.770708 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t688"] Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.785545 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t688"] Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.797149 4743 scope.go:117] "RemoveContainer" containerID="8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.841438 4743 scope.go:117] "RemoveContainer" containerID="830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56" Mar 10 16:17:54 crc kubenswrapper[4743]: E0310 16:17:54.841881 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56\": container with ID starting with 830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56 not found: ID does not exist" containerID="830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.841946 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56"} err="failed to get container status \"830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56\": rpc error: code = NotFound desc = could not find container \"830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56\": container with ID starting with 830b38694039f4e61086c65eff8fe1404554ad84a87040cd1bdb1ac3457e0a56 not found: 
ID does not exist" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.841966 4743 scope.go:117] "RemoveContainer" containerID="4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb" Mar 10 16:17:54 crc kubenswrapper[4743]: E0310 16:17:54.842216 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb\": container with ID starting with 4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb not found: ID does not exist" containerID="4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.842274 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb"} err="failed to get container status \"4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb\": rpc error: code = NotFound desc = could not find container \"4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb\": container with ID starting with 4a5a3ef08df733ff5fe0f6e2cce65d60403be53a21d2c74dd22aae487bba52bb not found: ID does not exist" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.842288 4743 scope.go:117] "RemoveContainer" containerID="8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a" Mar 10 16:17:54 crc kubenswrapper[4743]: E0310 16:17:54.842579 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a\": container with ID starting with 8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a not found: ID does not exist" containerID="8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a" Mar 10 16:17:54 crc kubenswrapper[4743]: I0310 16:17:54.842621 4743 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a"} err="failed to get container status \"8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a\": rpc error: code = NotFound desc = could not find container \"8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a\": container with ID starting with 8cc5f642a688430ce19f2a0014122e02cebd19996ea3e3f608536632d92ce47a not found: ID does not exist" Mar 10 16:17:55 crc kubenswrapper[4743]: I0310 16:17:55.929702 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" path="/var/lib/kubelet/pods/566926cf-f903-433a-8693-0ee4c8bf62c5/volumes" Mar 10 16:17:58 crc kubenswrapper[4743]: I0310 16:17:58.894092 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:58 crc kubenswrapper[4743]: I0310 16:17:58.954543 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:17:59 crc kubenswrapper[4743]: I0310 16:17:59.134717 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dv2ql"] Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.161366 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552658-cfkqt"] Mar 10 16:18:00 crc kubenswrapper[4743]: E0310 16:18:00.162483 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerName="registry-server" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.162503 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerName="registry-server" Mar 10 16:18:00 crc kubenswrapper[4743]: E0310 16:18:00.162554 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerName="extract-utilities" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.162563 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerName="extract-utilities" Mar 10 16:18:00 crc kubenswrapper[4743]: E0310 16:18:00.162577 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerName="extract-content" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.162585 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerName="extract-content" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.162861 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="566926cf-f903-433a-8693-0ee4c8bf62c5" containerName="registry-server" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.163783 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552658-cfkqt" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.166002 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.166942 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.166987 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.184009 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552658-cfkqt"] Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.275639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tntwb\" (UniqueName: 
\"kubernetes.io/projected/111075af-468f-4ff1-a238-55f1cca69653-kube-api-access-tntwb\") pod \"auto-csr-approver-29552658-cfkqt\" (UID: \"111075af-468f-4ff1-a238-55f1cca69653\") " pod="openshift-infra/auto-csr-approver-29552658-cfkqt" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.377709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tntwb\" (UniqueName: \"kubernetes.io/projected/111075af-468f-4ff1-a238-55f1cca69653-kube-api-access-tntwb\") pod \"auto-csr-approver-29552658-cfkqt\" (UID: \"111075af-468f-4ff1-a238-55f1cca69653\") " pod="openshift-infra/auto-csr-approver-29552658-cfkqt" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.660371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tntwb\" (UniqueName: \"kubernetes.io/projected/111075af-468f-4ff1-a238-55f1cca69653-kube-api-access-tntwb\") pod \"auto-csr-approver-29552658-cfkqt\" (UID: \"111075af-468f-4ff1-a238-55f1cca69653\") " pod="openshift-infra/auto-csr-approver-29552658-cfkqt" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.782020 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552658-cfkqt" Mar 10 16:18:00 crc kubenswrapper[4743]: I0310 16:18:00.791146 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dv2ql" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="registry-server" containerID="cri-o://89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95" gracePeriod=2 Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.282652 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552658-cfkqt"] Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.381973 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.404783 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78s6b\" (UniqueName: \"kubernetes.io/projected/cc79320b-71ad-4d8a-b54f-f1fa508ec934-kube-api-access-78s6b\") pod \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.404886 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-catalog-content\") pod \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.404962 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-utilities\") pod \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\" (UID: \"cc79320b-71ad-4d8a-b54f-f1fa508ec934\") " Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.405861 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-utilities" (OuterVolumeSpecName: "utilities") pod "cc79320b-71ad-4d8a-b54f-f1fa508ec934" (UID: "cc79320b-71ad-4d8a-b54f-f1fa508ec934"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.406735 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.412508 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc79320b-71ad-4d8a-b54f-f1fa508ec934-kube-api-access-78s6b" (OuterVolumeSpecName: "kube-api-access-78s6b") pod "cc79320b-71ad-4d8a-b54f-f1fa508ec934" (UID: "cc79320b-71ad-4d8a-b54f-f1fa508ec934"). InnerVolumeSpecName "kube-api-access-78s6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.471629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc79320b-71ad-4d8a-b54f-f1fa508ec934" (UID: "cc79320b-71ad-4d8a-b54f-f1fa508ec934"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.509003 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78s6b\" (UniqueName: \"kubernetes.io/projected/cc79320b-71ad-4d8a-b54f-f1fa508ec934-kube-api-access-78s6b\") on node \"crc\" DevicePath \"\"" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.509037 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc79320b-71ad-4d8a-b54f-f1fa508ec934-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.801582 4743 generic.go:334] "Generic (PLEG): container finished" podID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerID="89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95" exitCode=0 Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.801646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2ql" event={"ID":"cc79320b-71ad-4d8a-b54f-f1fa508ec934","Type":"ContainerDied","Data":"89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95"} Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.801672 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2ql" event={"ID":"cc79320b-71ad-4d8a-b54f-f1fa508ec934","Type":"ContainerDied","Data":"9e0ddf5c3afac555306e3387688c85c1b572f06e770e3f8e991ff5969b387fa8"} Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.801688 4743 scope.go:117] "RemoveContainer" containerID="89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.801836 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dv2ql" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.803573 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552658-cfkqt" event={"ID":"111075af-468f-4ff1-a238-55f1cca69653","Type":"ContainerStarted","Data":"5dfc63c3d92b2e1382dedccc801f9040dd41390f19405176041b43182138ff31"} Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.834573 4743 scope.go:117] "RemoveContainer" containerID="c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.838533 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dv2ql"] Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.847609 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dv2ql"] Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.879013 4743 scope.go:117] "RemoveContainer" containerID="09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.940910 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" path="/var/lib/kubelet/pods/cc79320b-71ad-4d8a-b54f-f1fa508ec934/volumes" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.943110 4743 scope.go:117] "RemoveContainer" containerID="89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95" Mar 10 16:18:01 crc kubenswrapper[4743]: E0310 16:18:01.943713 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95\": container with ID starting with 89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95 not found: ID does not exist" 
containerID="89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.943750 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95"} err="failed to get container status \"89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95\": rpc error: code = NotFound desc = could not find container \"89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95\": container with ID starting with 89a00f4635bb4633846b68ac0adaf992e3119c5c468299ab21798f144f774a95 not found: ID does not exist" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.943777 4743 scope.go:117] "RemoveContainer" containerID="c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2" Mar 10 16:18:01 crc kubenswrapper[4743]: E0310 16:18:01.944347 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2\": container with ID starting with c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2 not found: ID does not exist" containerID="c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.944371 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2"} err="failed to get container status \"c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2\": rpc error: code = NotFound desc = could not find container \"c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2\": container with ID starting with c4bc406fcfe15e13cd9477dc12a281218fa97b90cbb442ab48b03b24f2e78fa2 not found: ID does not exist" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.944386 4743 scope.go:117] 
"RemoveContainer" containerID="09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff" Mar 10 16:18:01 crc kubenswrapper[4743]: E0310 16:18:01.944880 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff\": container with ID starting with 09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff not found: ID does not exist" containerID="09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff" Mar 10 16:18:01 crc kubenswrapper[4743]: I0310 16:18:01.944900 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff"} err="failed to get container status \"09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff\": rpc error: code = NotFound desc = could not find container \"09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff\": container with ID starting with 09fa4ae7213c507fb21fa302c3d4b25f3f7669ca1fd020532f7f17827c54c4ff not found: ID does not exist" Mar 10 16:18:02 crc kubenswrapper[4743]: I0310 16:18:02.815955 4743 generic.go:334] "Generic (PLEG): container finished" podID="111075af-468f-4ff1-a238-55f1cca69653" containerID="4228407e53fd3ce0406e23671c6013945c39133d16ee50e6c950f3b74bdf0824" exitCode=0 Mar 10 16:18:02 crc kubenswrapper[4743]: I0310 16:18:02.816008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552658-cfkqt" event={"ID":"111075af-468f-4ff1-a238-55f1cca69653","Type":"ContainerDied","Data":"4228407e53fd3ce0406e23671c6013945c39133d16ee50e6c950f3b74bdf0824"} Mar 10 16:18:04 crc kubenswrapper[4743]: I0310 16:18:04.316329 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552658-cfkqt" Mar 10 16:18:04 crc kubenswrapper[4743]: I0310 16:18:04.368769 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tntwb\" (UniqueName: \"kubernetes.io/projected/111075af-468f-4ff1-a238-55f1cca69653-kube-api-access-tntwb\") pod \"111075af-468f-4ff1-a238-55f1cca69653\" (UID: \"111075af-468f-4ff1-a238-55f1cca69653\") " Mar 10 16:18:04 crc kubenswrapper[4743]: I0310 16:18:04.390061 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111075af-468f-4ff1-a238-55f1cca69653-kube-api-access-tntwb" (OuterVolumeSpecName: "kube-api-access-tntwb") pod "111075af-468f-4ff1-a238-55f1cca69653" (UID: "111075af-468f-4ff1-a238-55f1cca69653"). InnerVolumeSpecName "kube-api-access-tntwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:18:04 crc kubenswrapper[4743]: I0310 16:18:04.471349 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tntwb\" (UniqueName: \"kubernetes.io/projected/111075af-468f-4ff1-a238-55f1cca69653-kube-api-access-tntwb\") on node \"crc\" DevicePath \"\"" Mar 10 16:18:04 crc kubenswrapper[4743]: I0310 16:18:04.846233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552658-cfkqt" event={"ID":"111075af-468f-4ff1-a238-55f1cca69653","Type":"ContainerDied","Data":"5dfc63c3d92b2e1382dedccc801f9040dd41390f19405176041b43182138ff31"} Mar 10 16:18:04 crc kubenswrapper[4743]: I0310 16:18:04.846275 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dfc63c3d92b2e1382dedccc801f9040dd41390f19405176041b43182138ff31" Mar 10 16:18:04 crc kubenswrapper[4743]: I0310 16:18:04.846300 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552658-cfkqt" Mar 10 16:18:05 crc kubenswrapper[4743]: I0310 16:18:05.391863 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-zqdwx"] Mar 10 16:18:05 crc kubenswrapper[4743]: I0310 16:18:05.400163 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-zqdwx"] Mar 10 16:18:05 crc kubenswrapper[4743]: I0310 16:18:05.931267 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8ca985-8e46-4dc1-b71d-8d99b4f88b59" path="/var/lib/kubelet/pods/7b8ca985-8e46-4dc1-b71d-8d99b4f88b59/volumes" Mar 10 16:18:11 crc kubenswrapper[4743]: I0310 16:18:11.252171 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:18:11 crc kubenswrapper[4743]: I0310 16:18:11.252775 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:18:16 crc kubenswrapper[4743]: I0310 16:18:16.938679 4743 scope.go:117] "RemoveContainer" containerID="fb86ae6174b15273f6976480fd9b17f20528604c510527f878d6a7d89b52bb9b" Mar 10 16:18:41 crc kubenswrapper[4743]: I0310 16:18:41.252720 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:18:41 crc kubenswrapper[4743]: 
I0310 16:18:41.253276 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:18:41 crc kubenswrapper[4743]: I0310 16:18:41.253332 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 16:18:41 crc kubenswrapper[4743]: I0310 16:18:41.254223 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:18:41 crc kubenswrapper[4743]: I0310 16:18:41.254283 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" gracePeriod=600 Mar 10 16:18:41 crc kubenswrapper[4743]: E0310 16:18:41.378316 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:18:42 crc kubenswrapper[4743]: I0310 16:18:42.210479 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" exitCode=0 Mar 10 16:18:42 crc kubenswrapper[4743]: I0310 16:18:42.210546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862"} Mar 10 16:18:42 crc kubenswrapper[4743]: I0310 16:18:42.210768 4743 scope.go:117] "RemoveContainer" containerID="e35d3955c1caa5088792f1dddbbc2bda2687186cdaf535361f7d997435837e67" Mar 10 16:18:42 crc kubenswrapper[4743]: I0310 16:18:42.211486 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:18:42 crc kubenswrapper[4743]: E0310 16:18:42.211724 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:18:56 crc kubenswrapper[4743]: I0310 16:18:56.916438 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:18:56 crc kubenswrapper[4743]: E0310 16:18:56.917289 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 
16:19:09 crc kubenswrapper[4743]: I0310 16:19:09.915630 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:19:09 crc kubenswrapper[4743]: E0310 16:19:09.916943 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:19:24 crc kubenswrapper[4743]: I0310 16:19:24.915471 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:19:24 crc kubenswrapper[4743]: E0310 16:19:24.916275 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:19:39 crc kubenswrapper[4743]: I0310 16:19:39.919763 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:19:39 crc kubenswrapper[4743]: E0310 16:19:39.920605 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:19:50 crc kubenswrapper[4743]: I0310 16:19:50.916299 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:19:50 crc kubenswrapper[4743]: E0310 16:19:50.917180 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.153997 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552660-pgfdt"] Mar 10 16:20:00 crc kubenswrapper[4743]: E0310 16:20:00.155168 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111075af-468f-4ff1-a238-55f1cca69653" containerName="oc" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.155188 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="111075af-468f-4ff1-a238-55f1cca69653" containerName="oc" Mar 10 16:20:00 crc kubenswrapper[4743]: E0310 16:20:00.155207 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="extract-content" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.155215 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="extract-content" Mar 10 16:20:00 crc kubenswrapper[4743]: E0310 16:20:00.155241 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="extract-utilities" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.155250 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="extract-utilities" Mar 10 16:20:00 crc kubenswrapper[4743]: E0310 16:20:00.155267 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="registry-server" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.155287 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="registry-server" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.155543 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc79320b-71ad-4d8a-b54f-f1fa508ec934" containerName="registry-server" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.155554 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="111075af-468f-4ff1-a238-55f1cca69653" containerName="oc" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.156345 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552660-pgfdt" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.161916 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.162043 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.164282 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.168121 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552660-pgfdt"] Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.231171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nlxn\" (UniqueName: 
\"kubernetes.io/projected/ca109d6c-6ba5-44e6-abf5-506de51e0043-kube-api-access-6nlxn\") pod \"auto-csr-approver-29552660-pgfdt\" (UID: \"ca109d6c-6ba5-44e6-abf5-506de51e0043\") " pod="openshift-infra/auto-csr-approver-29552660-pgfdt" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.332746 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nlxn\" (UniqueName: \"kubernetes.io/projected/ca109d6c-6ba5-44e6-abf5-506de51e0043-kube-api-access-6nlxn\") pod \"auto-csr-approver-29552660-pgfdt\" (UID: \"ca109d6c-6ba5-44e6-abf5-506de51e0043\") " pod="openshift-infra/auto-csr-approver-29552660-pgfdt" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.357148 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nlxn\" (UniqueName: \"kubernetes.io/projected/ca109d6c-6ba5-44e6-abf5-506de51e0043-kube-api-access-6nlxn\") pod \"auto-csr-approver-29552660-pgfdt\" (UID: \"ca109d6c-6ba5-44e6-abf5-506de51e0043\") " pod="openshift-infra/auto-csr-approver-29552660-pgfdt" Mar 10 16:20:00 crc kubenswrapper[4743]: I0310 16:20:00.491890 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552660-pgfdt" Mar 10 16:20:01 crc kubenswrapper[4743]: I0310 16:20:01.694589 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552660-pgfdt"] Mar 10 16:20:01 crc kubenswrapper[4743]: I0310 16:20:01.928242 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552660-pgfdt" event={"ID":"ca109d6c-6ba5-44e6-abf5-506de51e0043","Type":"ContainerStarted","Data":"a1bc8bf9d834de094036d2d377badc1c424d38fd22b674224eb3c5af83716a45"} Mar 10 16:20:02 crc kubenswrapper[4743]: I0310 16:20:02.916037 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:20:02 crc kubenswrapper[4743]: E0310 16:20:02.916364 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:20:04 crc kubenswrapper[4743]: I0310 16:20:04.951427 4743 generic.go:334] "Generic (PLEG): container finished" podID="ca109d6c-6ba5-44e6-abf5-506de51e0043" containerID="0065bb863118ec407866caac01b55a89e765fb8d34d91232220be5f176eb5713" exitCode=0 Mar 10 16:20:04 crc kubenswrapper[4743]: I0310 16:20:04.951545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552660-pgfdt" event={"ID":"ca109d6c-6ba5-44e6-abf5-506de51e0043","Type":"ContainerDied","Data":"0065bb863118ec407866caac01b55a89e765fb8d34d91232220be5f176eb5713"} Mar 10 16:20:06 crc kubenswrapper[4743]: I0310 16:20:06.454489 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552660-pgfdt" Mar 10 16:20:06 crc kubenswrapper[4743]: I0310 16:20:06.476580 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nlxn\" (UniqueName: \"kubernetes.io/projected/ca109d6c-6ba5-44e6-abf5-506de51e0043-kube-api-access-6nlxn\") pod \"ca109d6c-6ba5-44e6-abf5-506de51e0043\" (UID: \"ca109d6c-6ba5-44e6-abf5-506de51e0043\") " Mar 10 16:20:06 crc kubenswrapper[4743]: I0310 16:20:06.487199 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca109d6c-6ba5-44e6-abf5-506de51e0043-kube-api-access-6nlxn" (OuterVolumeSpecName: "kube-api-access-6nlxn") pod "ca109d6c-6ba5-44e6-abf5-506de51e0043" (UID: "ca109d6c-6ba5-44e6-abf5-506de51e0043"). InnerVolumeSpecName "kube-api-access-6nlxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:20:06 crc kubenswrapper[4743]: I0310 16:20:06.578996 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nlxn\" (UniqueName: \"kubernetes.io/projected/ca109d6c-6ba5-44e6-abf5-506de51e0043-kube-api-access-6nlxn\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:06 crc kubenswrapper[4743]: I0310 16:20:06.976698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552660-pgfdt" event={"ID":"ca109d6c-6ba5-44e6-abf5-506de51e0043","Type":"ContainerDied","Data":"a1bc8bf9d834de094036d2d377badc1c424d38fd22b674224eb3c5af83716a45"} Mar 10 16:20:06 crc kubenswrapper[4743]: I0310 16:20:06.976748 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1bc8bf9d834de094036d2d377badc1c424d38fd22b674224eb3c5af83716a45" Mar 10 16:20:06 crc kubenswrapper[4743]: I0310 16:20:06.976831 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552660-pgfdt" Mar 10 16:20:07 crc kubenswrapper[4743]: I0310 16:20:07.559870 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552654-2sknh"] Mar 10 16:20:07 crc kubenswrapper[4743]: I0310 16:20:07.572747 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552654-2sknh"] Mar 10 16:20:07 crc kubenswrapper[4743]: I0310 16:20:07.928103 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ad2354-c9aa-47ba-aff5-f37e3d71fcc1" path="/var/lib/kubelet/pods/37ad2354-c9aa-47ba-aff5-f37e3d71fcc1/volumes" Mar 10 16:20:14 crc kubenswrapper[4743]: I0310 16:20:14.916415 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:20:14 crc kubenswrapper[4743]: E0310 16:20:14.917282 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:20:17 crc kubenswrapper[4743]: I0310 16:20:17.094708 4743 scope.go:117] "RemoveContainer" containerID="dc9165777f504abf0567ac54c6dcd53f3615cab276877ea3f26aa9295cb35176" Mar 10 16:20:25 crc kubenswrapper[4743]: I0310 16:20:25.923941 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:20:25 crc kubenswrapper[4743]: E0310 16:20:25.926094 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:20:38 crc kubenswrapper[4743]: I0310 16:20:38.916530 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:20:38 crc kubenswrapper[4743]: E0310 16:20:38.918432 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:20:51 crc kubenswrapper[4743]: I0310 16:20:51.915934 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:20:51 crc kubenswrapper[4743]: E0310 16:20:51.916667 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:21:03 crc kubenswrapper[4743]: I0310 16:21:03.916107 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:21:03 crc kubenswrapper[4743]: E0310 16:21:03.917089 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:21:16 crc kubenswrapper[4743]: I0310 16:21:16.917692 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:21:16 crc kubenswrapper[4743]: E0310 16:21:16.918458 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:21:27 crc kubenswrapper[4743]: I0310 16:21:27.916218 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:21:27 crc kubenswrapper[4743]: E0310 16:21:27.917445 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:21:42 crc kubenswrapper[4743]: I0310 16:21:42.916432 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:21:42 crc kubenswrapper[4743]: E0310 16:21:42.917253 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:21:53 crc kubenswrapper[4743]: I0310 16:21:53.916018 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:21:53 crc kubenswrapper[4743]: E0310 16:21:53.916948 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.165496 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552662-clljw"] Mar 10 16:22:00 crc kubenswrapper[4743]: E0310 16:22:00.166618 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca109d6c-6ba5-44e6-abf5-506de51e0043" containerName="oc" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.166640 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca109d6c-6ba5-44e6-abf5-506de51e0043" containerName="oc" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.167072 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca109d6c-6ba5-44e6-abf5-506de51e0043" containerName="oc" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.168108 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552662-clljw" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.173229 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.173324 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.173907 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.193026 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552662-clljw"] Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.231401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxx4\" (UniqueName: \"kubernetes.io/projected/aebd1f52-4b59-4d6e-b603-b6615e63a7fa-kube-api-access-nnxx4\") pod \"auto-csr-approver-29552662-clljw\" (UID: \"aebd1f52-4b59-4d6e-b603-b6615e63a7fa\") " pod="openshift-infra/auto-csr-approver-29552662-clljw" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.333408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxx4\" (UniqueName: \"kubernetes.io/projected/aebd1f52-4b59-4d6e-b603-b6615e63a7fa-kube-api-access-nnxx4\") pod \"auto-csr-approver-29552662-clljw\" (UID: \"aebd1f52-4b59-4d6e-b603-b6615e63a7fa\") " pod="openshift-infra/auto-csr-approver-29552662-clljw" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.352209 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxx4\" (UniqueName: \"kubernetes.io/projected/aebd1f52-4b59-4d6e-b603-b6615e63a7fa-kube-api-access-nnxx4\") pod \"auto-csr-approver-29552662-clljw\" (UID: \"aebd1f52-4b59-4d6e-b603-b6615e63a7fa\") " 
pod="openshift-infra/auto-csr-approver-29552662-clljw" Mar 10 16:22:00 crc kubenswrapper[4743]: I0310 16:22:00.491991 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552662-clljw" Mar 10 16:22:01 crc kubenswrapper[4743]: I0310 16:22:01.012670 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552662-clljw"] Mar 10 16:22:01 crc kubenswrapper[4743]: I0310 16:22:01.053539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552662-clljw" event={"ID":"aebd1f52-4b59-4d6e-b603-b6615e63a7fa","Type":"ContainerStarted","Data":"fdee53a374c0972d78831452445a92434e3a0069511f604b4b25dd2992841a40"} Mar 10 16:22:03 crc kubenswrapper[4743]: I0310 16:22:03.074548 4743 generic.go:334] "Generic (PLEG): container finished" podID="aebd1f52-4b59-4d6e-b603-b6615e63a7fa" containerID="5c210c8062114e4335ff0d1abfb60d71a7ff31fe124c3e911038f78b7dc44a94" exitCode=0 Mar 10 16:22:03 crc kubenswrapper[4743]: I0310 16:22:03.074647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552662-clljw" event={"ID":"aebd1f52-4b59-4d6e-b603-b6615e63a7fa","Type":"ContainerDied","Data":"5c210c8062114e4335ff0d1abfb60d71a7ff31fe124c3e911038f78b7dc44a94"} Mar 10 16:22:04 crc kubenswrapper[4743]: I0310 16:22:04.588698 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552662-clljw" Mar 10 16:22:04 crc kubenswrapper[4743]: I0310 16:22:04.621089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnxx4\" (UniqueName: \"kubernetes.io/projected/aebd1f52-4b59-4d6e-b603-b6615e63a7fa-kube-api-access-nnxx4\") pod \"aebd1f52-4b59-4d6e-b603-b6615e63a7fa\" (UID: \"aebd1f52-4b59-4d6e-b603-b6615e63a7fa\") " Mar 10 16:22:04 crc kubenswrapper[4743]: I0310 16:22:04.659762 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aebd1f52-4b59-4d6e-b603-b6615e63a7fa-kube-api-access-nnxx4" (OuterVolumeSpecName: "kube-api-access-nnxx4") pod "aebd1f52-4b59-4d6e-b603-b6615e63a7fa" (UID: "aebd1f52-4b59-4d6e-b603-b6615e63a7fa"). InnerVolumeSpecName "kube-api-access-nnxx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:22:04 crc kubenswrapper[4743]: I0310 16:22:04.723509 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnxx4\" (UniqueName: \"kubernetes.io/projected/aebd1f52-4b59-4d6e-b603-b6615e63a7fa-kube-api-access-nnxx4\") on node \"crc\" DevicePath \"\"" Mar 10 16:22:05 crc kubenswrapper[4743]: I0310 16:22:05.095932 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552662-clljw" event={"ID":"aebd1f52-4b59-4d6e-b603-b6615e63a7fa","Type":"ContainerDied","Data":"fdee53a374c0972d78831452445a92434e3a0069511f604b4b25dd2992841a40"} Mar 10 16:22:05 crc kubenswrapper[4743]: I0310 16:22:05.095989 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdee53a374c0972d78831452445a92434e3a0069511f604b4b25dd2992841a40" Mar 10 16:22:05 crc kubenswrapper[4743]: I0310 16:22:05.096338 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552662-clljw" Mar 10 16:22:05 crc kubenswrapper[4743]: I0310 16:22:05.662169 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552656-gfgnw"] Mar 10 16:22:05 crc kubenswrapper[4743]: I0310 16:22:05.673134 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552656-gfgnw"] Mar 10 16:22:05 crc kubenswrapper[4743]: I0310 16:22:05.922600 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:22:05 crc kubenswrapper[4743]: E0310 16:22:05.923215 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:22:05 crc kubenswrapper[4743]: I0310 16:22:05.927364 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d2fdff-e1d9-4b41-b6c2-42489002e915" path="/var/lib/kubelet/pods/f6d2fdff-e1d9-4b41-b6c2-42489002e915/volumes" Mar 10 16:22:16 crc kubenswrapper[4743]: I0310 16:22:16.914782 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:22:16 crc kubenswrapper[4743]: E0310 16:22:16.915483 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" 
podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:22:17 crc kubenswrapper[4743]: I0310 16:22:17.192123 4743 scope.go:117] "RemoveContainer" containerID="bf0a9f67ba185ecdeac1ad334a8e37ff2322ba9cdcf76e6c6ad59fb4a85c5de8" Mar 10 16:22:31 crc kubenswrapper[4743]: I0310 16:22:31.916500 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:22:31 crc kubenswrapper[4743]: E0310 16:22:31.917292 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.611247 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5szwn"] Mar 10 16:22:38 crc kubenswrapper[4743]: E0310 16:22:38.612348 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebd1f52-4b59-4d6e-b603-b6615e63a7fa" containerName="oc" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.612363 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebd1f52-4b59-4d6e-b603-b6615e63a7fa" containerName="oc" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.612571 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aebd1f52-4b59-4d6e-b603-b6615e63a7fa" containerName="oc" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.613968 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.622128 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5szwn"] Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.754062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-utilities\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.754408 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5tj9\" (UniqueName: \"kubernetes.io/projected/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-kube-api-access-n5tj9\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.754493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-catalog-content\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.856217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-utilities\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.856282 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n5tj9\" (UniqueName: \"kubernetes.io/projected/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-kube-api-access-n5tj9\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.856349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-catalog-content\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.856964 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-catalog-content\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.856995 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-utilities\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.882866 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5tj9\" (UniqueName: \"kubernetes.io/projected/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-kube-api-access-n5tj9\") pod \"certified-operators-5szwn\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:38 crc kubenswrapper[4743]: I0310 16:22:38.941727 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:39 crc kubenswrapper[4743]: I0310 16:22:39.542985 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5szwn"] Mar 10 16:22:40 crc kubenswrapper[4743]: I0310 16:22:40.456994 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerID="d5c5b2f3b6be63699903e423e6fc976e0c8b428caef04032032b1eeaa0949425" exitCode=0 Mar 10 16:22:40 crc kubenswrapper[4743]: I0310 16:22:40.457173 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5szwn" event={"ID":"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f","Type":"ContainerDied","Data":"d5c5b2f3b6be63699903e423e6fc976e0c8b428caef04032032b1eeaa0949425"} Mar 10 16:22:40 crc kubenswrapper[4743]: I0310 16:22:40.457958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5szwn" event={"ID":"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f","Type":"ContainerStarted","Data":"78ef33eb5e83567ea10b8afd4fbcc40fcae31576bd8f2cbddfb606c85d45aa65"} Mar 10 16:22:42 crc kubenswrapper[4743]: I0310 16:22:42.477028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5szwn" event={"ID":"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f","Type":"ContainerStarted","Data":"85d4df0af593532fc2dd8b58f8879b87b4e7b291b7d90d7e328cb9eb92e2beeb"} Mar 10 16:22:42 crc kubenswrapper[4743]: I0310 16:22:42.915448 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:22:42 crc kubenswrapper[4743]: E0310 16:22:42.915932 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:22:43 crc kubenswrapper[4743]: I0310 16:22:43.492587 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerID="85d4df0af593532fc2dd8b58f8879b87b4e7b291b7d90d7e328cb9eb92e2beeb" exitCode=0 Mar 10 16:22:43 crc kubenswrapper[4743]: I0310 16:22:43.492668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5szwn" event={"ID":"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f","Type":"ContainerDied","Data":"85d4df0af593532fc2dd8b58f8879b87b4e7b291b7d90d7e328cb9eb92e2beeb"} Mar 10 16:22:43 crc kubenswrapper[4743]: I0310 16:22:43.495840 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:22:44 crc kubenswrapper[4743]: I0310 16:22:44.501422 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5szwn" event={"ID":"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f","Type":"ContainerStarted","Data":"4ca53a4df4db83c39a9beb20e83d03265072ae67c3c525c7b1ec7153d580b2a9"} Mar 10 16:22:44 crc kubenswrapper[4743]: I0310 16:22:44.522537 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5szwn" podStartSLOduration=2.945405526 podStartE2EDuration="6.522519497s" podCreationTimestamp="2026-03-10 16:22:38 +0000 UTC" firstStartedPulling="2026-03-10 16:22:40.459003273 +0000 UTC m=+4625.165818021" lastFinishedPulling="2026-03-10 16:22:44.036117234 +0000 UTC m=+4628.742931992" observedRunningTime="2026-03-10 16:22:44.515078635 +0000 UTC m=+4629.221893393" watchObservedRunningTime="2026-03-10 16:22:44.522519497 +0000 UTC m=+4629.229334245" Mar 10 16:22:48 crc kubenswrapper[4743]: I0310 16:22:48.942300 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:48 crc kubenswrapper[4743]: I0310 16:22:48.942947 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:49 crc kubenswrapper[4743]: I0310 16:22:49.009457 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:49 crc kubenswrapper[4743]: I0310 16:22:49.620410 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:49 crc kubenswrapper[4743]: I0310 16:22:49.679221 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5szwn"] Mar 10 16:22:51 crc kubenswrapper[4743]: I0310 16:22:51.567020 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5szwn" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerName="registry-server" containerID="cri-o://4ca53a4df4db83c39a9beb20e83d03265072ae67c3c525c7b1ec7153d580b2a9" gracePeriod=2 Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.590565 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerID="4ca53a4df4db83c39a9beb20e83d03265072ae67c3c525c7b1ec7153d580b2a9" exitCode=0 Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.590659 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5szwn" event={"ID":"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f","Type":"ContainerDied","Data":"4ca53a4df4db83c39a9beb20e83d03265072ae67c3c525c7b1ec7153d580b2a9"} Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.746774 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.842589 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-utilities\") pod \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.843089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-catalog-content\") pod \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.843151 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5tj9\" (UniqueName: \"kubernetes.io/projected/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-kube-api-access-n5tj9\") pod \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\" (UID: \"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f\") " Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.843699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-utilities" (OuterVolumeSpecName: "utilities") pod "0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" (UID: "0ed4ec49-f1d9-45e7-8d25-8dba8670a53f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.849339 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-kube-api-access-n5tj9" (OuterVolumeSpecName: "kube-api-access-n5tj9") pod "0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" (UID: "0ed4ec49-f1d9-45e7-8d25-8dba8670a53f"). InnerVolumeSpecName "kube-api-access-n5tj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.929461 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" (UID: "0ed4ec49-f1d9-45e7-8d25-8dba8670a53f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.946142 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.946181 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5tj9\" (UniqueName: \"kubernetes.io/projected/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-kube-api-access-n5tj9\") on node \"crc\" DevicePath \"\"" Mar 10 16:22:52 crc kubenswrapper[4743]: I0310 16:22:52.946200 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:22:53 crc kubenswrapper[4743]: I0310 16:22:53.601347 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5szwn" event={"ID":"0ed4ec49-f1d9-45e7-8d25-8dba8670a53f","Type":"ContainerDied","Data":"78ef33eb5e83567ea10b8afd4fbcc40fcae31576bd8f2cbddfb606c85d45aa65"} Mar 10 16:22:53 crc kubenswrapper[4743]: I0310 16:22:53.601416 4743 scope.go:117] "RemoveContainer" containerID="4ca53a4df4db83c39a9beb20e83d03265072ae67c3c525c7b1ec7153d580b2a9" Mar 10 16:22:53 crc kubenswrapper[4743]: I0310 16:22:53.601429 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5szwn" Mar 10 16:22:53 crc kubenswrapper[4743]: I0310 16:22:53.645993 4743 scope.go:117] "RemoveContainer" containerID="85d4df0af593532fc2dd8b58f8879b87b4e7b291b7d90d7e328cb9eb92e2beeb" Mar 10 16:22:53 crc kubenswrapper[4743]: I0310 16:22:53.652292 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5szwn"] Mar 10 16:22:53 crc kubenswrapper[4743]: I0310 16:22:53.665794 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5szwn"] Mar 10 16:22:53 crc kubenswrapper[4743]: I0310 16:22:53.688939 4743 scope.go:117] "RemoveContainer" containerID="d5c5b2f3b6be63699903e423e6fc976e0c8b428caef04032032b1eeaa0949425" Mar 10 16:22:53 crc kubenswrapper[4743]: I0310 16:22:53.924832 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" path="/var/lib/kubelet/pods/0ed4ec49-f1d9-45e7-8d25-8dba8670a53f/volumes" Mar 10 16:22:57 crc kubenswrapper[4743]: I0310 16:22:57.916352 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:22:57 crc kubenswrapper[4743]: E0310 16:22:57.916919 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:23:11 crc kubenswrapper[4743]: I0310 16:23:11.918429 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:23:11 crc kubenswrapper[4743]: E0310 16:23:11.919525 4743 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:23:25 crc kubenswrapper[4743]: I0310 16:23:25.925563 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:23:25 crc kubenswrapper[4743]: E0310 16:23:25.926781 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:23:36 crc kubenswrapper[4743]: I0310 16:23:36.916125 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:23:36 crc kubenswrapper[4743]: E0310 16:23:36.917108 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:23:50 crc kubenswrapper[4743]: I0310 16:23:50.915105 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:23:51 crc kubenswrapper[4743]: I0310 16:23:51.187963 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"957f98a08ddd6714ecff509167edd29c6f257ddbc30383c819edc430ac24893f"} Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.165117 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552664-wzplb"] Mar 10 16:24:00 crc kubenswrapper[4743]: E0310 16:24:00.167398 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerName="extract-content" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.167488 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerName="extract-content" Mar 10 16:24:00 crc kubenswrapper[4743]: E0310 16:24:00.167551 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerName="registry-server" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.167605 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerName="registry-server" Mar 10 16:24:00 crc kubenswrapper[4743]: E0310 16:24:00.167668 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerName="extract-utilities" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.167719 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerName="extract-utilities" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.168071 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed4ec49-f1d9-45e7-8d25-8dba8670a53f" containerName="registry-server" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.169035 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552664-wzplb" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.171633 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.171632 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.172721 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.185128 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552664-wzplb"] Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.259102 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qwhz\" (UniqueName: \"kubernetes.io/projected/a19da3dc-cca4-4b3f-9970-0f6b44098031-kube-api-access-8qwhz\") pod \"auto-csr-approver-29552664-wzplb\" (UID: \"a19da3dc-cca4-4b3f-9970-0f6b44098031\") " pod="openshift-infra/auto-csr-approver-29552664-wzplb" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.362248 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qwhz\" (UniqueName: \"kubernetes.io/projected/a19da3dc-cca4-4b3f-9970-0f6b44098031-kube-api-access-8qwhz\") pod \"auto-csr-approver-29552664-wzplb\" (UID: \"a19da3dc-cca4-4b3f-9970-0f6b44098031\") " pod="openshift-infra/auto-csr-approver-29552664-wzplb" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.383492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qwhz\" (UniqueName: \"kubernetes.io/projected/a19da3dc-cca4-4b3f-9970-0f6b44098031-kube-api-access-8qwhz\") pod \"auto-csr-approver-29552664-wzplb\" (UID: \"a19da3dc-cca4-4b3f-9970-0f6b44098031\") " 
pod="openshift-infra/auto-csr-approver-29552664-wzplb" Mar 10 16:24:00 crc kubenswrapper[4743]: I0310 16:24:00.499516 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552664-wzplb" Mar 10 16:24:01 crc kubenswrapper[4743]: I0310 16:24:01.023717 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552664-wzplb"] Mar 10 16:24:01 crc kubenswrapper[4743]: W0310 16:24:01.025641 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda19da3dc_cca4_4b3f_9970_0f6b44098031.slice/crio-16f33be1e0cad29bcd5694485dcce9529b09e672adc8562bfbecdc2a11836a93 WatchSource:0}: Error finding container 16f33be1e0cad29bcd5694485dcce9529b09e672adc8562bfbecdc2a11836a93: Status 404 returned error can't find the container with id 16f33be1e0cad29bcd5694485dcce9529b09e672adc8562bfbecdc2a11836a93 Mar 10 16:24:01 crc kubenswrapper[4743]: I0310 16:24:01.292084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552664-wzplb" event={"ID":"a19da3dc-cca4-4b3f-9970-0f6b44098031","Type":"ContainerStarted","Data":"16f33be1e0cad29bcd5694485dcce9529b09e672adc8562bfbecdc2a11836a93"} Mar 10 16:24:03 crc kubenswrapper[4743]: I0310 16:24:03.315107 4743 generic.go:334] "Generic (PLEG): container finished" podID="a19da3dc-cca4-4b3f-9970-0f6b44098031" containerID="1fc7feddb092cc3a7d5152f0c965a8c29ce3e3eb3079ea4c66a76fd008c47a8d" exitCode=0 Mar 10 16:24:03 crc kubenswrapper[4743]: I0310 16:24:03.315309 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552664-wzplb" event={"ID":"a19da3dc-cca4-4b3f-9970-0f6b44098031","Type":"ContainerDied","Data":"1fc7feddb092cc3a7d5152f0c965a8c29ce3e3eb3079ea4c66a76fd008c47a8d"} Mar 10 16:24:04 crc kubenswrapper[4743]: I0310 16:24:04.749163 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552664-wzplb" Mar 10 16:24:04 crc kubenswrapper[4743]: I0310 16:24:04.871126 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qwhz\" (UniqueName: \"kubernetes.io/projected/a19da3dc-cca4-4b3f-9970-0f6b44098031-kube-api-access-8qwhz\") pod \"a19da3dc-cca4-4b3f-9970-0f6b44098031\" (UID: \"a19da3dc-cca4-4b3f-9970-0f6b44098031\") " Mar 10 16:24:04 crc kubenswrapper[4743]: I0310 16:24:04.889710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19da3dc-cca4-4b3f-9970-0f6b44098031-kube-api-access-8qwhz" (OuterVolumeSpecName: "kube-api-access-8qwhz") pod "a19da3dc-cca4-4b3f-9970-0f6b44098031" (UID: "a19da3dc-cca4-4b3f-9970-0f6b44098031"). InnerVolumeSpecName "kube-api-access-8qwhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:24:04 crc kubenswrapper[4743]: I0310 16:24:04.974562 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qwhz\" (UniqueName: \"kubernetes.io/projected/a19da3dc-cca4-4b3f-9970-0f6b44098031-kube-api-access-8qwhz\") on node \"crc\" DevicePath \"\"" Mar 10 16:24:05 crc kubenswrapper[4743]: I0310 16:24:05.339675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552664-wzplb" event={"ID":"a19da3dc-cca4-4b3f-9970-0f6b44098031","Type":"ContainerDied","Data":"16f33be1e0cad29bcd5694485dcce9529b09e672adc8562bfbecdc2a11836a93"} Mar 10 16:24:05 crc kubenswrapper[4743]: I0310 16:24:05.339718 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16f33be1e0cad29bcd5694485dcce9529b09e672adc8562bfbecdc2a11836a93" Mar 10 16:24:05 crc kubenswrapper[4743]: I0310 16:24:05.339750 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552664-wzplb" Mar 10 16:24:05 crc kubenswrapper[4743]: I0310 16:24:05.832634 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552658-cfkqt"] Mar 10 16:24:05 crc kubenswrapper[4743]: I0310 16:24:05.841109 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552658-cfkqt"] Mar 10 16:24:05 crc kubenswrapper[4743]: I0310 16:24:05.934549 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111075af-468f-4ff1-a238-55f1cca69653" path="/var/lib/kubelet/pods/111075af-468f-4ff1-a238-55f1cca69653/volumes" Mar 10 16:24:17 crc kubenswrapper[4743]: I0310 16:24:17.339751 4743 scope.go:117] "RemoveContainer" containerID="4228407e53fd3ce0406e23671c6013945c39133d16ee50e6c950f3b74bdf0824" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.164252 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvq8n"] Mar 10 16:24:20 crc kubenswrapper[4743]: E0310 16:24:20.166345 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19da3dc-cca4-4b3f-9970-0f6b44098031" containerName="oc" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.166724 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19da3dc-cca4-4b3f-9970-0f6b44098031" containerName="oc" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.167188 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19da3dc-cca4-4b3f-9970-0f6b44098031" containerName="oc" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.169167 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.186647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvq8n"] Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.205678 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-catalog-content\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.205778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-utilities\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.205828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4jj\" (UniqueName: \"kubernetes.io/projected/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-kube-api-access-kn4jj\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.307518 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-catalog-content\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.307580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-utilities\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.307602 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4jj\" (UniqueName: \"kubernetes.io/projected/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-kube-api-access-kn4jj\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.308392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-catalog-content\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.308608 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-utilities\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.332393 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4jj\" (UniqueName: \"kubernetes.io/projected/3ebb183b-5e0b-4c6f-ad0d-96bd60832852-kube-api-access-kn4jj\") pod \"redhat-operators-zvq8n\" (UID: \"3ebb183b-5e0b-4c6f-ad0d-96bd60832852\") " pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.491554 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:20 crc kubenswrapper[4743]: I0310 16:24:20.816606 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvq8n"] Mar 10 16:24:21 crc kubenswrapper[4743]: I0310 16:24:21.475520 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvq8n" event={"ID":"3ebb183b-5e0b-4c6f-ad0d-96bd60832852","Type":"ContainerStarted","Data":"c5e2d997b7d4a5826e9c3d7eb15324295febd4abf1a02ddaf0c8987c49afed48"} Mar 10 16:24:21 crc kubenswrapper[4743]: I0310 16:24:21.476157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvq8n" event={"ID":"3ebb183b-5e0b-4c6f-ad0d-96bd60832852","Type":"ContainerStarted","Data":"c1f1894809615603cbdd7f19f2900d2e3b9f750fa8f8ba618fda55f5a1505c84"} Mar 10 16:24:22 crc kubenswrapper[4743]: I0310 16:24:22.484374 4743 generic.go:334] "Generic (PLEG): container finished" podID="3ebb183b-5e0b-4c6f-ad0d-96bd60832852" containerID="c5e2d997b7d4a5826e9c3d7eb15324295febd4abf1a02ddaf0c8987c49afed48" exitCode=0 Mar 10 16:24:22 crc kubenswrapper[4743]: I0310 16:24:22.484474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvq8n" event={"ID":"3ebb183b-5e0b-4c6f-ad0d-96bd60832852","Type":"ContainerDied","Data":"c5e2d997b7d4a5826e9c3d7eb15324295febd4abf1a02ddaf0c8987c49afed48"} Mar 10 16:24:31 crc kubenswrapper[4743]: I0310 16:24:31.569154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvq8n" event={"ID":"3ebb183b-5e0b-4c6f-ad0d-96bd60832852","Type":"ContainerStarted","Data":"dfffd02ccaa4dde09be0d80af70ea75bdc648d07b4363dfaec1762e0b753b6bf"} Mar 10 16:24:33 crc kubenswrapper[4743]: I0310 16:24:33.594481 4743 generic.go:334] "Generic (PLEG): container finished" podID="3ebb183b-5e0b-4c6f-ad0d-96bd60832852" 
containerID="dfffd02ccaa4dde09be0d80af70ea75bdc648d07b4363dfaec1762e0b753b6bf" exitCode=0 Mar 10 16:24:33 crc kubenswrapper[4743]: I0310 16:24:33.594975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvq8n" event={"ID":"3ebb183b-5e0b-4c6f-ad0d-96bd60832852","Type":"ContainerDied","Data":"dfffd02ccaa4dde09be0d80af70ea75bdc648d07b4363dfaec1762e0b753b6bf"} Mar 10 16:24:34 crc kubenswrapper[4743]: I0310 16:24:34.610730 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvq8n" event={"ID":"3ebb183b-5e0b-4c6f-ad0d-96bd60832852","Type":"ContainerStarted","Data":"7881fdef79b9d4516f29acae5673706677984a812c31db8f9f81ac6279b262e4"} Mar 10 16:24:34 crc kubenswrapper[4743]: I0310 16:24:34.660305 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvq8n" podStartSLOduration=3.133431604 podStartE2EDuration="14.660271687s" podCreationTimestamp="2026-03-10 16:24:20 +0000 UTC" firstStartedPulling="2026-03-10 16:24:22.486378296 +0000 UTC m=+4727.193193044" lastFinishedPulling="2026-03-10 16:24:34.013218379 +0000 UTC m=+4738.720033127" observedRunningTime="2026-03-10 16:24:34.640810354 +0000 UTC m=+4739.347625152" watchObservedRunningTime="2026-03-10 16:24:34.660271687 +0000 UTC m=+4739.367086475" Mar 10 16:24:40 crc kubenswrapper[4743]: I0310 16:24:40.492233 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:40 crc kubenswrapper[4743]: I0310 16:24:40.494650 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:40 crc kubenswrapper[4743]: I0310 16:24:40.568756 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:40 crc kubenswrapper[4743]: I0310 16:24:40.734752 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvq8n" Mar 10 16:24:40 crc kubenswrapper[4743]: I0310 16:24:40.866842 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvq8n"] Mar 10 16:24:40 crc kubenswrapper[4743]: I0310 16:24:40.946140 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cvw9l"] Mar 10 16:24:40 crc kubenswrapper[4743]: I0310 16:24:40.946464 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cvw9l" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="registry-server" containerID="cri-o://8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd" gracePeriod=2 Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.510776 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.606988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-catalog-content\") pod \"190a3104-25c3-4135-bd63-b9e56380c9b9\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.607041 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp25n\" (UniqueName: \"kubernetes.io/projected/190a3104-25c3-4135-bd63-b9e56380c9b9-kube-api-access-fp25n\") pod \"190a3104-25c3-4135-bd63-b9e56380c9b9\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.607085 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-utilities\") 
pod \"190a3104-25c3-4135-bd63-b9e56380c9b9\" (UID: \"190a3104-25c3-4135-bd63-b9e56380c9b9\") " Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.610227 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-utilities" (OuterVolumeSpecName: "utilities") pod "190a3104-25c3-4135-bd63-b9e56380c9b9" (UID: "190a3104-25c3-4135-bd63-b9e56380c9b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.620212 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190a3104-25c3-4135-bd63-b9e56380c9b9-kube-api-access-fp25n" (OuterVolumeSpecName: "kube-api-access-fp25n") pod "190a3104-25c3-4135-bd63-b9e56380c9b9" (UID: "190a3104-25c3-4135-bd63-b9e56380c9b9"). InnerVolumeSpecName "kube-api-access-fp25n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.679860 4743 generic.go:334] "Generic (PLEG): container finished" podID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerID="8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd" exitCode=0 Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.680283 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvw9l" event={"ID":"190a3104-25c3-4135-bd63-b9e56380c9b9","Type":"ContainerDied","Data":"8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd"} Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.680372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvw9l" event={"ID":"190a3104-25c3-4135-bd63-b9e56380c9b9","Type":"ContainerDied","Data":"7aee3e398014285b016be28813ae9bfe6ea345f3a1ae1189d515c3cfdacd535a"} Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.680400 4743 scope.go:117] "RemoveContainer" 
containerID="8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.680405 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cvw9l" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.705591 4743 scope.go:117] "RemoveContainer" containerID="a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.709189 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp25n\" (UniqueName: \"kubernetes.io/projected/190a3104-25c3-4135-bd63-b9e56380c9b9-kube-api-access-fp25n\") on node \"crc\" DevicePath \"\"" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.709213 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.742115 4743 scope.go:117] "RemoveContainer" containerID="707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.765479 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "190a3104-25c3-4135-bd63-b9e56380c9b9" (UID: "190a3104-25c3-4135-bd63-b9e56380c9b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.788513 4743 scope.go:117] "RemoveContainer" containerID="8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd" Mar 10 16:24:41 crc kubenswrapper[4743]: E0310 16:24:41.789190 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd\": container with ID starting with 8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd not found: ID does not exist" containerID="8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.789225 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd"} err="failed to get container status \"8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd\": rpc error: code = NotFound desc = could not find container \"8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd\": container with ID starting with 8dce3e1f2dfc9d1ff6d6981e88a941c8d39a6521afded8adb3730959bdce33dd not found: ID does not exist" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.789248 4743 scope.go:117] "RemoveContainer" containerID="a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec" Mar 10 16:24:41 crc kubenswrapper[4743]: E0310 16:24:41.789517 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec\": container with ID starting with a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec not found: ID does not exist" containerID="a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.789545 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec"} err="failed to get container status \"a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec\": rpc error: code = NotFound desc = could not find container \"a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec\": container with ID starting with a0aae53a2a42e607ce3c4c5287f90f93428ee1931ffff95cf2b610bff51ec2ec not found: ID does not exist" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.789560 4743 scope.go:117] "RemoveContainer" containerID="707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f" Mar 10 16:24:41 crc kubenswrapper[4743]: E0310 16:24:41.790160 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f\": container with ID starting with 707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f not found: ID does not exist" containerID="707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.790189 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f"} err="failed to get container status \"707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f\": rpc error: code = NotFound desc = could not find container \"707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f\": container with ID starting with 707cf0804e828494b31138aa5975734aea90566a5dd1a2ad197375df20dcff7f not found: ID does not exist" Mar 10 16:24:41 crc kubenswrapper[4743]: I0310 16:24:41.810963 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/190a3104-25c3-4135-bd63-b9e56380c9b9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:24:42 crc kubenswrapper[4743]: I0310 16:24:42.003526 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cvw9l"] Mar 10 16:24:42 crc kubenswrapper[4743]: I0310 16:24:42.012116 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cvw9l"] Mar 10 16:24:43 crc kubenswrapper[4743]: I0310 16:24:43.925921 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" path="/var/lib/kubelet/pods/190a3104-25c3-4135-bd63-b9e56380c9b9/volumes" Mar 10 16:25:15 crc kubenswrapper[4743]: I0310 16:25:15.229425 4743 generic.go:334] "Generic (PLEG): container finished" podID="fa680413-f368-421d-914c-1941e02c2c57" containerID="44dcc03ade38ccd23aaa0bde7d1b97d53489b3aea5d6eadc5f26a931bf4e0a97" exitCode=0 Mar 10 16:25:15 crc kubenswrapper[4743]: I0310 16:25:15.229631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fa680413-f368-421d-914c-1941e02c2c57","Type":"ContainerDied","Data":"44dcc03ade38ccd23aaa0bde7d1b97d53489b3aea5d6eadc5f26a931bf4e0a97"} Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.654121 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.787907 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbxg2\" (UniqueName: \"kubernetes.io/projected/fa680413-f368-421d-914c-1941e02c2c57-kube-api-access-jbxg2\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.788283 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-temporary\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.788343 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ssh-key\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.788388 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.788423 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-openstack-config\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.788452 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-workdir\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.788495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-openstack-config-secret\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.788522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ca-certs\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.788545 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-config-data\") pod \"fa680413-f368-421d-914c-1941e02c2c57\" (UID: \"fa680413-f368-421d-914c-1941e02c2c57\") " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.789735 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.793757 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-config-data" (OuterVolumeSpecName: "config-data") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.796186 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa680413-f368-421d-914c-1941e02c2c57-kube-api-access-jbxg2" (OuterVolumeSpecName: "kube-api-access-jbxg2") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "kube-api-access-jbxg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.800286 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.800551 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.820691 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.833224 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.836046 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.843067 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fa680413-f368-421d-914c-1941e02c2c57" (UID: "fa680413-f368-421d-914c-1941e02c2c57"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.890568 4743 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.890611 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.890649 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.890662 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.890678 4743 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fa680413-f368-421d-914c-1941e02c2c57-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.890690 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.890704 4743 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fa680413-f368-421d-914c-1941e02c2c57-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:16 crc 
kubenswrapper[4743]: I0310 16:25:16.890810 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa680413-f368-421d-914c-1941e02c2c57-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.890834 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbxg2\" (UniqueName: \"kubernetes.io/projected/fa680413-f368-421d-914c-1941e02c2c57-kube-api-access-jbxg2\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.917618 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 16:25:16 crc kubenswrapper[4743]: I0310 16:25:16.993316 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:25:17 crc kubenswrapper[4743]: I0310 16:25:17.251345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fa680413-f368-421d-914c-1941e02c2c57","Type":"ContainerDied","Data":"324c5203973df81b8a32781238e5d83fa4f17ccb03d1a5b91707bd19fe5a04f2"} Mar 10 16:25:17 crc kubenswrapper[4743]: I0310 16:25:17.251386 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="324c5203973df81b8a32781238e5d83fa4f17ccb03d1a5b91707bd19fe5a04f2" Mar 10 16:25:17 crc kubenswrapper[4743]: I0310 16:25:17.251412 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.757086 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 16:25:18 crc kubenswrapper[4743]: E0310 16:25:18.757883 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa680413-f368-421d-914c-1941e02c2c57" containerName="tempest-tests-tempest-tests-runner" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.757898 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa680413-f368-421d-914c-1941e02c2c57" containerName="tempest-tests-tempest-tests-runner" Mar 10 16:25:18 crc kubenswrapper[4743]: E0310 16:25:18.757925 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="extract-content" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.757932 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="extract-content" Mar 10 16:25:18 crc kubenswrapper[4743]: E0310 16:25:18.757944 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="registry-server" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.757950 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="registry-server" Mar 10 16:25:18 crc kubenswrapper[4743]: E0310 16:25:18.757971 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="extract-utilities" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.757978 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="extract-utilities" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.758188 4743 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="fa680413-f368-421d-914c-1941e02c2c57" containerName="tempest-tests-tempest-tests-runner" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.758208 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="190a3104-25c3-4135-bd63-b9e56380c9b9" containerName="registry-server" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.758962 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.769569 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.935520 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxjp9\" (UniqueName: \"kubernetes.io/projected/e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862-kube-api-access-wxjp9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:18 crc kubenswrapper[4743]: I0310 16:25:18.935596 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:19 crc kubenswrapper[4743]: I0310 16:25:19.037167 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxjp9\" (UniqueName: \"kubernetes.io/projected/e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862-kube-api-access-wxjp9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:19 crc kubenswrapper[4743]: I0310 16:25:19.037213 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:19 crc kubenswrapper[4743]: I0310 16:25:19.037474 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:19 crc kubenswrapper[4743]: I0310 16:25:19.056561 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxjp9\" (UniqueName: \"kubernetes.io/projected/e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862-kube-api-access-wxjp9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:19 crc kubenswrapper[4743]: I0310 16:25:19.062990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:19 crc kubenswrapper[4743]: I0310 16:25:19.083787 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:25:19 crc kubenswrapper[4743]: I0310 16:25:19.517833 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 16:25:20 crc kubenswrapper[4743]: I0310 16:25:20.295897 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862","Type":"ContainerStarted","Data":"9a1dc87c4ae4a5a4df2002bd500ae17e8189d93a5c7e082cd2e04a4f6cdc816f"} Mar 10 16:25:21 crc kubenswrapper[4743]: I0310 16:25:21.350388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862","Type":"ContainerStarted","Data":"be6032dce609d661c69aa3e41eeccb045da07431710de0bfb5e312de211f667e"} Mar 10 16:25:21 crc kubenswrapper[4743]: I0310 16:25:21.378095 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.161895496 podStartE2EDuration="3.378072306s" podCreationTimestamp="2026-03-10 16:25:18 +0000 UTC" firstStartedPulling="2026-03-10 16:25:19.524593905 +0000 UTC m=+4784.231408663" lastFinishedPulling="2026-03-10 16:25:20.740770725 +0000 UTC m=+4785.447585473" observedRunningTime="2026-03-10 16:25:21.369279646 +0000 UTC m=+4786.076094394" watchObservedRunningTime="2026-03-10 16:25:21.378072306 +0000 UTC m=+4786.084887054" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.381943 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-87xtf/must-gather-75tqt"] Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.383972 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.385905 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-87xtf"/"kube-root-ca.crt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.386152 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-87xtf"/"openshift-service-ca.crt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.395219 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-87xtf/must-gather-75tqt"] Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.410999 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-87xtf"/"default-dockercfg-wdfqd" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.472529 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzhq6\" (UniqueName: \"kubernetes.io/projected/16ef51b2-326c-403e-996c-2791378770a3-kube-api-access-mzhq6\") pod \"must-gather-75tqt\" (UID: \"16ef51b2-326c-403e-996c-2791378770a3\") " pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.472775 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16ef51b2-326c-403e-996c-2791378770a3-must-gather-output\") pod \"must-gather-75tqt\" (UID: \"16ef51b2-326c-403e-996c-2791378770a3\") " pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.575247 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16ef51b2-326c-403e-996c-2791378770a3-must-gather-output\") pod \"must-gather-75tqt\" (UID: \"16ef51b2-326c-403e-996c-2791378770a3\") " 
pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.575352 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzhq6\" (UniqueName: \"kubernetes.io/projected/16ef51b2-326c-403e-996c-2791378770a3-kube-api-access-mzhq6\") pod \"must-gather-75tqt\" (UID: \"16ef51b2-326c-403e-996c-2791378770a3\") " pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.576300 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16ef51b2-326c-403e-996c-2791378770a3-must-gather-output\") pod \"must-gather-75tqt\" (UID: \"16ef51b2-326c-403e-996c-2791378770a3\") " pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.602492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzhq6\" (UniqueName: \"kubernetes.io/projected/16ef51b2-326c-403e-996c-2791378770a3-kube-api-access-mzhq6\") pod \"must-gather-75tqt\" (UID: \"16ef51b2-326c-403e-996c-2791378770a3\") " pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:25:59 crc kubenswrapper[4743]: I0310 16:25:59.774058 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.157096 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552666-dnlgn"] Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.160338 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552666-dnlgn" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.164208 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.164879 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.165117 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.171583 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552666-dnlgn"] Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.189855 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccw6b\" (UniqueName: \"kubernetes.io/projected/e4dd393b-7153-474c-a81c-c07d0cb9d1db-kube-api-access-ccw6b\") pod \"auto-csr-approver-29552666-dnlgn\" (UID: \"e4dd393b-7153-474c-a81c-c07d0cb9d1db\") " pod="openshift-infra/auto-csr-approver-29552666-dnlgn" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.237414 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-87xtf/must-gather-75tqt"] Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.291936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccw6b\" (UniqueName: \"kubernetes.io/projected/e4dd393b-7153-474c-a81c-c07d0cb9d1db-kube-api-access-ccw6b\") pod \"auto-csr-approver-29552666-dnlgn\" (UID: \"e4dd393b-7153-474c-a81c-c07d0cb9d1db\") " pod="openshift-infra/auto-csr-approver-29552666-dnlgn" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.315473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccw6b\" (UniqueName: 
\"kubernetes.io/projected/e4dd393b-7153-474c-a81c-c07d0cb9d1db-kube-api-access-ccw6b\") pod \"auto-csr-approver-29552666-dnlgn\" (UID: \"e4dd393b-7153-474c-a81c-c07d0cb9d1db\") " pod="openshift-infra/auto-csr-approver-29552666-dnlgn" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.489595 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552666-dnlgn" Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.732020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/must-gather-75tqt" event={"ID":"16ef51b2-326c-403e-996c-2791378770a3","Type":"ContainerStarted","Data":"4086243131a774d1669362d2ad1fabe319ab35300fcea5209084e741f5bc4e94"} Mar 10 16:26:00 crc kubenswrapper[4743]: I0310 16:26:00.980536 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552666-dnlgn"] Mar 10 16:26:00 crc kubenswrapper[4743]: W0310 16:26:00.989190 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4dd393b_7153_474c_a81c_c07d0cb9d1db.slice/crio-54cd8ca2989ccb51abb4162ab287231ef8bcabbd5fdfa0ea87a28c8784a9b45b WatchSource:0}: Error finding container 54cd8ca2989ccb51abb4162ab287231ef8bcabbd5fdfa0ea87a28c8784a9b45b: Status 404 returned error can't find the container with id 54cd8ca2989ccb51abb4162ab287231ef8bcabbd5fdfa0ea87a28c8784a9b45b Mar 10 16:26:01 crc kubenswrapper[4743]: I0310 16:26:01.758291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552666-dnlgn" event={"ID":"e4dd393b-7153-474c-a81c-c07d0cb9d1db","Type":"ContainerStarted","Data":"54cd8ca2989ccb51abb4162ab287231ef8bcabbd5fdfa0ea87a28c8784a9b45b"} Mar 10 16:26:02 crc kubenswrapper[4743]: I0310 16:26:02.771594 4743 generic.go:334] "Generic (PLEG): container finished" podID="e4dd393b-7153-474c-a81c-c07d0cb9d1db" 
containerID="d1527afc00637b519ad9be826bf34320cfd01a53af49438848c06d81c19bb260" exitCode=0 Mar 10 16:26:02 crc kubenswrapper[4743]: I0310 16:26:02.771969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552666-dnlgn" event={"ID":"e4dd393b-7153-474c-a81c-c07d0cb9d1db","Type":"ContainerDied","Data":"d1527afc00637b519ad9be826bf34320cfd01a53af49438848c06d81c19bb260"} Mar 10 16:26:04 crc kubenswrapper[4743]: I0310 16:26:04.789796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552666-dnlgn" event={"ID":"e4dd393b-7153-474c-a81c-c07d0cb9d1db","Type":"ContainerDied","Data":"54cd8ca2989ccb51abb4162ab287231ef8bcabbd5fdfa0ea87a28c8784a9b45b"} Mar 10 16:26:04 crc kubenswrapper[4743]: I0310 16:26:04.790253 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54cd8ca2989ccb51abb4162ab287231ef8bcabbd5fdfa0ea87a28c8784a9b45b" Mar 10 16:26:04 crc kubenswrapper[4743]: I0310 16:26:04.834401 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552666-dnlgn" Mar 10 16:26:04 crc kubenswrapper[4743]: I0310 16:26:04.899122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccw6b\" (UniqueName: \"kubernetes.io/projected/e4dd393b-7153-474c-a81c-c07d0cb9d1db-kube-api-access-ccw6b\") pod \"e4dd393b-7153-474c-a81c-c07d0cb9d1db\" (UID: \"e4dd393b-7153-474c-a81c-c07d0cb9d1db\") " Mar 10 16:26:04 crc kubenswrapper[4743]: I0310 16:26:04.910106 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4dd393b-7153-474c-a81c-c07d0cb9d1db-kube-api-access-ccw6b" (OuterVolumeSpecName: "kube-api-access-ccw6b") pod "e4dd393b-7153-474c-a81c-c07d0cb9d1db" (UID: "e4dd393b-7153-474c-a81c-c07d0cb9d1db"). InnerVolumeSpecName "kube-api-access-ccw6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:26:05 crc kubenswrapper[4743]: I0310 16:26:05.002977 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccw6b\" (UniqueName: \"kubernetes.io/projected/e4dd393b-7153-474c-a81c-c07d0cb9d1db-kube-api-access-ccw6b\") on node \"crc\" DevicePath \"\"" Mar 10 16:26:05 crc kubenswrapper[4743]: I0310 16:26:05.801499 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552666-dnlgn" Mar 10 16:26:05 crc kubenswrapper[4743]: I0310 16:26:05.910565 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552660-pgfdt"] Mar 10 16:26:05 crc kubenswrapper[4743]: I0310 16:26:05.954596 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552660-pgfdt"] Mar 10 16:26:07 crc kubenswrapper[4743]: I0310 16:26:07.823745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/must-gather-75tqt" event={"ID":"16ef51b2-326c-403e-996c-2791378770a3","Type":"ContainerStarted","Data":"94741437692a7767a61752805f74806cec09d9eb745a9e200fb4843773457a15"} Mar 10 16:26:07 crc kubenswrapper[4743]: I0310 16:26:07.824358 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/must-gather-75tqt" event={"ID":"16ef51b2-326c-403e-996c-2791378770a3","Type":"ContainerStarted","Data":"308e38ba2a0c8143709f8ccc5005d67787ed76e1a65bb0b9d6bffdb8c9288aea"} Mar 10 16:26:07 crc kubenswrapper[4743]: I0310 16:26:07.848738 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-87xtf/must-gather-75tqt" podStartSLOduration=2.445837851 podStartE2EDuration="8.848721004s" podCreationTimestamp="2026-03-10 16:25:59 +0000 UTC" firstStartedPulling="2026-03-10 16:26:00.258996603 +0000 UTC m=+4824.965811351" lastFinishedPulling="2026-03-10 16:26:06.661879756 +0000 UTC m=+4831.368694504" 
observedRunningTime="2026-03-10 16:26:07.839313117 +0000 UTC m=+4832.546127875" watchObservedRunningTime="2026-03-10 16:26:07.848721004 +0000 UTC m=+4832.555535752" Mar 10 16:26:07 crc kubenswrapper[4743]: I0310 16:26:07.931643 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca109d6c-6ba5-44e6-abf5-506de51e0043" path="/var/lib/kubelet/pods/ca109d6c-6ba5-44e6-abf5-506de51e0043/volumes" Mar 10 16:26:11 crc kubenswrapper[4743]: I0310 16:26:11.252530 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:26:11 crc kubenswrapper[4743]: I0310 16:26:11.253136 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.099705 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-87xtf/crc-debug-crwnp"] Mar 10 16:26:13 crc kubenswrapper[4743]: E0310 16:26:13.111447 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4dd393b-7153-474c-a81c-c07d0cb9d1db" containerName="oc" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.111477 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4dd393b-7153-474c-a81c-c07d0cb9d1db" containerName="oc" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.111922 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4dd393b-7153-474c-a81c-c07d0cb9d1db" containerName="oc" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.112861 4743 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.188329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspvb\" (UniqueName: \"kubernetes.io/projected/bf317c2d-465f-4a23-bb39-28e021fbfa35-kube-api-access-gspvb\") pod \"crc-debug-crwnp\" (UID: \"bf317c2d-465f-4a23-bb39-28e021fbfa35\") " pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.188638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf317c2d-465f-4a23-bb39-28e021fbfa35-host\") pod \"crc-debug-crwnp\" (UID: \"bf317c2d-465f-4a23-bb39-28e021fbfa35\") " pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.289984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf317c2d-465f-4a23-bb39-28e021fbfa35-host\") pod \"crc-debug-crwnp\" (UID: \"bf317c2d-465f-4a23-bb39-28e021fbfa35\") " pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.290142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gspvb\" (UniqueName: \"kubernetes.io/projected/bf317c2d-465f-4a23-bb39-28e021fbfa35-kube-api-access-gspvb\") pod \"crc-debug-crwnp\" (UID: \"bf317c2d-465f-4a23-bb39-28e021fbfa35\") " pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.290180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf317c2d-465f-4a23-bb39-28e021fbfa35-host\") pod \"crc-debug-crwnp\" (UID: \"bf317c2d-465f-4a23-bb39-28e021fbfa35\") " pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 
16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.313184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspvb\" (UniqueName: \"kubernetes.io/projected/bf317c2d-465f-4a23-bb39-28e021fbfa35-kube-api-access-gspvb\") pod \"crc-debug-crwnp\" (UID: \"bf317c2d-465f-4a23-bb39-28e021fbfa35\") " pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.448428 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:26:13 crc kubenswrapper[4743]: I0310 16:26:13.881934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/crc-debug-crwnp" event={"ID":"bf317c2d-465f-4a23-bb39-28e021fbfa35","Type":"ContainerStarted","Data":"2f72042e99c62a71e058b2c2e2392be0265191df52d3cbb432ecfc075f9690d4"} Mar 10 16:26:17 crc kubenswrapper[4743]: I0310 16:26:17.458427 4743 scope.go:117] "RemoveContainer" containerID="0065bb863118ec407866caac01b55a89e765fb8d34d91232220be5f176eb5713" Mar 10 16:26:25 crc kubenswrapper[4743]: I0310 16:26:25.010038 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/crc-debug-crwnp" event={"ID":"bf317c2d-465f-4a23-bb39-28e021fbfa35","Type":"ContainerStarted","Data":"479fcfcd456f84603df779cd452017c57731e8a2b56775d9817a0ce5c0cc927a"} Mar 10 16:26:25 crc kubenswrapper[4743]: I0310 16:26:25.032918 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-87xtf/crc-debug-crwnp" podStartSLOduration=0.812269653 podStartE2EDuration="12.032896654s" podCreationTimestamp="2026-03-10 16:26:13 +0000 UTC" firstStartedPulling="2026-03-10 16:26:13.492403373 +0000 UTC m=+4838.199218121" lastFinishedPulling="2026-03-10 16:26:24.713030374 +0000 UTC m=+4849.419845122" observedRunningTime="2026-03-10 16:26:25.023402414 +0000 UTC m=+4849.730217162" watchObservedRunningTime="2026-03-10 16:26:25.032896654 +0000 
UTC m=+4849.739711402" Mar 10 16:26:41 crc kubenswrapper[4743]: I0310 16:26:41.252526 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:26:41 crc kubenswrapper[4743]: I0310 16:26:41.253249 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.252832 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.253346 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.253400 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.254153 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"957f98a08ddd6714ecff509167edd29c6f257ddbc30383c819edc430ac24893f"} 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.254198 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://957f98a08ddd6714ecff509167edd29c6f257ddbc30383c819edc430ac24893f" gracePeriod=600 Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.442523 4743 generic.go:334] "Generic (PLEG): container finished" podID="bf317c2d-465f-4a23-bb39-28e021fbfa35" containerID="479fcfcd456f84603df779cd452017c57731e8a2b56775d9817a0ce5c0cc927a" exitCode=0 Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.442869 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/crc-debug-crwnp" event={"ID":"bf317c2d-465f-4a23-bb39-28e021fbfa35","Type":"ContainerDied","Data":"479fcfcd456f84603df779cd452017c57731e8a2b56775d9817a0ce5c0cc927a"} Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.448090 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="957f98a08ddd6714ecff509167edd29c6f257ddbc30383c819edc430ac24893f" exitCode=0 Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.448128 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"957f98a08ddd6714ecff509167edd29c6f257ddbc30383c819edc430ac24893f"} Mar 10 16:27:11 crc kubenswrapper[4743]: I0310 16:27:11.448156 4743 scope.go:117] "RemoveContainer" containerID="6713659f86c0fa2b382b8e6097b0a3c547a02d4df1bdde7a9dc68bc9ea648862" Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.459220 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739"} Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.580638 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.625336 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-87xtf/crc-debug-crwnp"] Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.633613 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-87xtf/crc-debug-crwnp"] Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.671517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gspvb\" (UniqueName: \"kubernetes.io/projected/bf317c2d-465f-4a23-bb39-28e021fbfa35-kube-api-access-gspvb\") pod \"bf317c2d-465f-4a23-bb39-28e021fbfa35\" (UID: \"bf317c2d-465f-4a23-bb39-28e021fbfa35\") " Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.671838 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf317c2d-465f-4a23-bb39-28e021fbfa35-host\") pod \"bf317c2d-465f-4a23-bb39-28e021fbfa35\" (UID: \"bf317c2d-465f-4a23-bb39-28e021fbfa35\") " Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.672562 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf317c2d-465f-4a23-bb39-28e021fbfa35-host" (OuterVolumeSpecName: "host") pod "bf317c2d-465f-4a23-bb39-28e021fbfa35" (UID: "bf317c2d-465f-4a23-bb39-28e021fbfa35"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.677388 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf317c2d-465f-4a23-bb39-28e021fbfa35-kube-api-access-gspvb" (OuterVolumeSpecName: "kube-api-access-gspvb") pod "bf317c2d-465f-4a23-bb39-28e021fbfa35" (UID: "bf317c2d-465f-4a23-bb39-28e021fbfa35"). InnerVolumeSpecName "kube-api-access-gspvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.774133 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gspvb\" (UniqueName: \"kubernetes.io/projected/bf317c2d-465f-4a23-bb39-28e021fbfa35-kube-api-access-gspvb\") on node \"crc\" DevicePath \"\"" Mar 10 16:27:12 crc kubenswrapper[4743]: I0310 16:27:12.774176 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf317c2d-465f-4a23-bb39-28e021fbfa35-host\") on node \"crc\" DevicePath \"\"" Mar 10 16:27:13 crc kubenswrapper[4743]: I0310 16:27:13.471271 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f72042e99c62a71e058b2c2e2392be0265191df52d3cbb432ecfc075f9690d4" Mar 10 16:27:13 crc kubenswrapper[4743]: I0310 16:27:13.471303 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-crwnp" Mar 10 16:27:13 crc kubenswrapper[4743]: I0310 16:27:13.873455 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-87xtf/crc-debug-nkjr9"] Mar 10 16:27:13 crc kubenswrapper[4743]: E0310 16:27:13.875720 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf317c2d-465f-4a23-bb39-28e021fbfa35" containerName="container-00" Mar 10 16:27:13 crc kubenswrapper[4743]: I0310 16:27:13.875746 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf317c2d-465f-4a23-bb39-28e021fbfa35" containerName="container-00" Mar 10 16:27:13 crc kubenswrapper[4743]: I0310 16:27:13.878127 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf317c2d-465f-4a23-bb39-28e021fbfa35" containerName="container-00" Mar 10 16:27:13 crc kubenswrapper[4743]: I0310 16:27:13.879234 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:13 crc kubenswrapper[4743]: I0310 16:27:13.928873 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf317c2d-465f-4a23-bb39-28e021fbfa35" path="/var/lib/kubelet/pods/bf317c2d-465f-4a23-bb39-28e021fbfa35/volumes" Mar 10 16:27:14 crc kubenswrapper[4743]: I0310 16:27:14.004040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df87f19d-c67a-425c-aac1-b305b9d7cef5-host\") pod \"crc-debug-nkjr9\" (UID: \"df87f19d-c67a-425c-aac1-b305b9d7cef5\") " pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:14 crc kubenswrapper[4743]: I0310 16:27:14.004121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lkd8\" (UniqueName: \"kubernetes.io/projected/df87f19d-c67a-425c-aac1-b305b9d7cef5-kube-api-access-2lkd8\") pod \"crc-debug-nkjr9\" (UID: 
\"df87f19d-c67a-425c-aac1-b305b9d7cef5\") " pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:14 crc kubenswrapper[4743]: I0310 16:27:14.106428 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lkd8\" (UniqueName: \"kubernetes.io/projected/df87f19d-c67a-425c-aac1-b305b9d7cef5-kube-api-access-2lkd8\") pod \"crc-debug-nkjr9\" (UID: \"df87f19d-c67a-425c-aac1-b305b9d7cef5\") " pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:14 crc kubenswrapper[4743]: I0310 16:27:14.106703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df87f19d-c67a-425c-aac1-b305b9d7cef5-host\") pod \"crc-debug-nkjr9\" (UID: \"df87f19d-c67a-425c-aac1-b305b9d7cef5\") " pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:14 crc kubenswrapper[4743]: I0310 16:27:14.106830 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df87f19d-c67a-425c-aac1-b305b9d7cef5-host\") pod \"crc-debug-nkjr9\" (UID: \"df87f19d-c67a-425c-aac1-b305b9d7cef5\") " pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:14 crc kubenswrapper[4743]: I0310 16:27:14.130431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lkd8\" (UniqueName: \"kubernetes.io/projected/df87f19d-c67a-425c-aac1-b305b9d7cef5-kube-api-access-2lkd8\") pod \"crc-debug-nkjr9\" (UID: \"df87f19d-c67a-425c-aac1-b305b9d7cef5\") " pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:14 crc kubenswrapper[4743]: I0310 16:27:14.202232 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:14 crc kubenswrapper[4743]: W0310 16:27:14.245283 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf87f19d_c67a_425c_aac1_b305b9d7cef5.slice/crio-3a10f7b4223118677c2e15ade875d5f2fa7ec20ab3288df5655364f75abacc5e WatchSource:0}: Error finding container 3a10f7b4223118677c2e15ade875d5f2fa7ec20ab3288df5655364f75abacc5e: Status 404 returned error can't find the container with id 3a10f7b4223118677c2e15ade875d5f2fa7ec20ab3288df5655364f75abacc5e Mar 10 16:27:14 crc kubenswrapper[4743]: I0310 16:27:14.482057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/crc-debug-nkjr9" event={"ID":"df87f19d-c67a-425c-aac1-b305b9d7cef5","Type":"ContainerStarted","Data":"3a10f7b4223118677c2e15ade875d5f2fa7ec20ab3288df5655364f75abacc5e"} Mar 10 16:27:15 crc kubenswrapper[4743]: I0310 16:27:15.491741 4743 generic.go:334] "Generic (PLEG): container finished" podID="df87f19d-c67a-425c-aac1-b305b9d7cef5" containerID="5e6181dfae88f446ffb5ec81617ca2aaed85ac07b50486ea013c0d21c5c3d8a4" exitCode=0 Mar 10 16:27:15 crc kubenswrapper[4743]: I0310 16:27:15.491790 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/crc-debug-nkjr9" event={"ID":"df87f19d-c67a-425c-aac1-b305b9d7cef5","Type":"ContainerDied","Data":"5e6181dfae88f446ffb5ec81617ca2aaed85ac07b50486ea013c0d21c5c3d8a4"} Mar 10 16:27:16 crc kubenswrapper[4743]: I0310 16:27:16.651737 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:16 crc kubenswrapper[4743]: I0310 16:27:16.751161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lkd8\" (UniqueName: \"kubernetes.io/projected/df87f19d-c67a-425c-aac1-b305b9d7cef5-kube-api-access-2lkd8\") pod \"df87f19d-c67a-425c-aac1-b305b9d7cef5\" (UID: \"df87f19d-c67a-425c-aac1-b305b9d7cef5\") " Mar 10 16:27:16 crc kubenswrapper[4743]: I0310 16:27:16.751245 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df87f19d-c67a-425c-aac1-b305b9d7cef5-host\") pod \"df87f19d-c67a-425c-aac1-b305b9d7cef5\" (UID: \"df87f19d-c67a-425c-aac1-b305b9d7cef5\") " Mar 10 16:27:16 crc kubenswrapper[4743]: I0310 16:27:16.751860 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df87f19d-c67a-425c-aac1-b305b9d7cef5-host" (OuterVolumeSpecName: "host") pod "df87f19d-c67a-425c-aac1-b305b9d7cef5" (UID: "df87f19d-c67a-425c-aac1-b305b9d7cef5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:27:16 crc kubenswrapper[4743]: I0310 16:27:16.759175 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df87f19d-c67a-425c-aac1-b305b9d7cef5-kube-api-access-2lkd8" (OuterVolumeSpecName: "kube-api-access-2lkd8") pod "df87f19d-c67a-425c-aac1-b305b9d7cef5" (UID: "df87f19d-c67a-425c-aac1-b305b9d7cef5"). InnerVolumeSpecName "kube-api-access-2lkd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:27:16 crc kubenswrapper[4743]: I0310 16:27:16.853917 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lkd8\" (UniqueName: \"kubernetes.io/projected/df87f19d-c67a-425c-aac1-b305b9d7cef5-kube-api-access-2lkd8\") on node \"crc\" DevicePath \"\"" Mar 10 16:27:16 crc kubenswrapper[4743]: I0310 16:27:16.853965 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df87f19d-c67a-425c-aac1-b305b9d7cef5-host\") on node \"crc\" DevicePath \"\"" Mar 10 16:27:17 crc kubenswrapper[4743]: I0310 16:27:17.525539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/crc-debug-nkjr9" event={"ID":"df87f19d-c67a-425c-aac1-b305b9d7cef5","Type":"ContainerDied","Data":"3a10f7b4223118677c2e15ade875d5f2fa7ec20ab3288df5655364f75abacc5e"} Mar 10 16:27:17 crc kubenswrapper[4743]: I0310 16:27:17.525596 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a10f7b4223118677c2e15ade875d5f2fa7ec20ab3288df5655364f75abacc5e" Mar 10 16:27:17 crc kubenswrapper[4743]: I0310 16:27:17.525605 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-nkjr9" Mar 10 16:27:18 crc kubenswrapper[4743]: I0310 16:27:18.379752 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-87xtf/crc-debug-nkjr9"] Mar 10 16:27:18 crc kubenswrapper[4743]: I0310 16:27:18.390764 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-87xtf/crc-debug-nkjr9"] Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.617489 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-87xtf/crc-debug-j8dt5"] Mar 10 16:27:19 crc kubenswrapper[4743]: E0310 16:27:19.618182 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df87f19d-c67a-425c-aac1-b305b9d7cef5" containerName="container-00" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.618205 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="df87f19d-c67a-425c-aac1-b305b9d7cef5" containerName="container-00" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.618568 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="df87f19d-c67a-425c-aac1-b305b9d7cef5" containerName="container-00" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.619947 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.717427 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxwq\" (UniqueName: \"kubernetes.io/projected/b31fa8da-9fdb-4145-a2ef-aff736dee403-kube-api-access-4dxwq\") pod \"crc-debug-j8dt5\" (UID: \"b31fa8da-9fdb-4145-a2ef-aff736dee403\") " pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.717584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b31fa8da-9fdb-4145-a2ef-aff736dee403-host\") pod \"crc-debug-j8dt5\" (UID: \"b31fa8da-9fdb-4145-a2ef-aff736dee403\") " pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.819525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxwq\" (UniqueName: \"kubernetes.io/projected/b31fa8da-9fdb-4145-a2ef-aff736dee403-kube-api-access-4dxwq\") pod \"crc-debug-j8dt5\" (UID: \"b31fa8da-9fdb-4145-a2ef-aff736dee403\") " pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.819598 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b31fa8da-9fdb-4145-a2ef-aff736dee403-host\") pod \"crc-debug-j8dt5\" (UID: \"b31fa8da-9fdb-4145-a2ef-aff736dee403\") " pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.819840 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b31fa8da-9fdb-4145-a2ef-aff736dee403-host\") pod \"crc-debug-j8dt5\" (UID: \"b31fa8da-9fdb-4145-a2ef-aff736dee403\") " pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:19 crc 
kubenswrapper[4743]: I0310 16:27:19.839032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxwq\" (UniqueName: \"kubernetes.io/projected/b31fa8da-9fdb-4145-a2ef-aff736dee403-kube-api-access-4dxwq\") pod \"crc-debug-j8dt5\" (UID: \"b31fa8da-9fdb-4145-a2ef-aff736dee403\") " pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.927237 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df87f19d-c67a-425c-aac1-b305b9d7cef5" path="/var/lib/kubelet/pods/df87f19d-c67a-425c-aac1-b305b9d7cef5/volumes" Mar 10 16:27:19 crc kubenswrapper[4743]: I0310 16:27:19.941490 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:20 crc kubenswrapper[4743]: I0310 16:27:20.560848 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/crc-debug-j8dt5" event={"ID":"b31fa8da-9fdb-4145-a2ef-aff736dee403","Type":"ContainerStarted","Data":"b21fb6ef9146b60c322adc36fbace663fdff6642ec8510068834972e0afc0832"} Mar 10 16:27:21 crc kubenswrapper[4743]: I0310 16:27:21.574115 4743 generic.go:334] "Generic (PLEG): container finished" podID="b31fa8da-9fdb-4145-a2ef-aff736dee403" containerID="7a2e8d37ca24365eddab091941742e45387833197037a0f56699f017529337d5" exitCode=0 Mar 10 16:27:21 crc kubenswrapper[4743]: I0310 16:27:21.574174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/crc-debug-j8dt5" event={"ID":"b31fa8da-9fdb-4145-a2ef-aff736dee403","Type":"ContainerDied","Data":"7a2e8d37ca24365eddab091941742e45387833197037a0f56699f017529337d5"} Mar 10 16:27:21 crc kubenswrapper[4743]: I0310 16:27:21.633873 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-87xtf/crc-debug-j8dt5"] Mar 10 16:27:21 crc kubenswrapper[4743]: I0310 16:27:21.647439 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-87xtf/crc-debug-j8dt5"] Mar 10 16:27:22 crc kubenswrapper[4743]: I0310 16:27:22.698965 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:22 crc kubenswrapper[4743]: I0310 16:27:22.783267 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b31fa8da-9fdb-4145-a2ef-aff736dee403-host\") pod \"b31fa8da-9fdb-4145-a2ef-aff736dee403\" (UID: \"b31fa8da-9fdb-4145-a2ef-aff736dee403\") " Mar 10 16:27:22 crc kubenswrapper[4743]: I0310 16:27:22.783646 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxwq\" (UniqueName: \"kubernetes.io/projected/b31fa8da-9fdb-4145-a2ef-aff736dee403-kube-api-access-4dxwq\") pod \"b31fa8da-9fdb-4145-a2ef-aff736dee403\" (UID: \"b31fa8da-9fdb-4145-a2ef-aff736dee403\") " Mar 10 16:27:22 crc kubenswrapper[4743]: I0310 16:27:22.783419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b31fa8da-9fdb-4145-a2ef-aff736dee403-host" (OuterVolumeSpecName: "host") pod "b31fa8da-9fdb-4145-a2ef-aff736dee403" (UID: "b31fa8da-9fdb-4145-a2ef-aff736dee403"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:27:22 crc kubenswrapper[4743]: I0310 16:27:22.784353 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b31fa8da-9fdb-4145-a2ef-aff736dee403-host\") on node \"crc\" DevicePath \"\"" Mar 10 16:27:22 crc kubenswrapper[4743]: I0310 16:27:22.789798 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31fa8da-9fdb-4145-a2ef-aff736dee403-kube-api-access-4dxwq" (OuterVolumeSpecName: "kube-api-access-4dxwq") pod "b31fa8da-9fdb-4145-a2ef-aff736dee403" (UID: "b31fa8da-9fdb-4145-a2ef-aff736dee403"). 
InnerVolumeSpecName "kube-api-access-4dxwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:27:22 crc kubenswrapper[4743]: I0310 16:27:22.886982 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dxwq\" (UniqueName: \"kubernetes.io/projected/b31fa8da-9fdb-4145-a2ef-aff736dee403-kube-api-access-4dxwq\") on node \"crc\" DevicePath \"\"" Mar 10 16:27:23 crc kubenswrapper[4743]: I0310 16:27:23.595422 4743 scope.go:117] "RemoveContainer" containerID="7a2e8d37ca24365eddab091941742e45387833197037a0f56699f017529337d5" Mar 10 16:27:23 crc kubenswrapper[4743]: I0310 16:27:23.595544 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-87xtf/crc-debug-j8dt5" Mar 10 16:27:23 crc kubenswrapper[4743]: I0310 16:27:23.925904 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31fa8da-9fdb-4145-a2ef-aff736dee403" path="/var/lib/kubelet/pods/b31fa8da-9fdb-4145-a2ef-aff736dee403/volumes" Mar 10 16:27:39 crc kubenswrapper[4743]: I0310 16:27:39.654585 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85ffb9c4dd-pf4mm_a98f89f9-bf8d-4a56-868d-dba8bf4c56e7/barbican-api/0.log" Mar 10 16:27:39 crc kubenswrapper[4743]: I0310 16:27:39.832745 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85ffb9c4dd-pf4mm_a98f89f9-bf8d-4a56-868d-dba8bf4c56e7/barbican-api-log/0.log" Mar 10 16:27:39 crc kubenswrapper[4743]: I0310 16:27:39.872438 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7dbc4488d6-xzd4j_15e55638-f4f6-4dcb-944d-671e657f664e/barbican-keystone-listener/0.log" Mar 10 16:27:40 crc kubenswrapper[4743]: I0310 16:27:40.171053 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86d87df6b7-xvf5q_05d63a20-7909-4ff1-8416-f55b448633fb/barbican-worker/0.log" Mar 10 16:27:40 crc kubenswrapper[4743]: I0310 
16:27:40.237408 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86d87df6b7-xvf5q_05d63a20-7909-4ff1-8416-f55b448633fb/barbican-worker-log/0.log" Mar 10 16:27:40 crc kubenswrapper[4743]: I0310 16:27:40.414306 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-nckf4_5806fcf8-1c71-408a-b87a-c4574daf14b6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:40 crc kubenswrapper[4743]: I0310 16:27:40.623888 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7dbc4488d6-xzd4j_15e55638-f4f6-4dcb-944d-671e657f664e/barbican-keystone-listener-log/0.log" Mar 10 16:27:40 crc kubenswrapper[4743]: I0310 16:27:40.695781 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5869f21f-5fc7-4837-b7e1-688cc16dc3ef/ceilometer-notification-agent/0.log" Mar 10 16:27:40 crc kubenswrapper[4743]: I0310 16:27:40.717455 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5869f21f-5fc7-4837-b7e1-688cc16dc3ef/ceilometer-central-agent/0.log" Mar 10 16:27:40 crc kubenswrapper[4743]: I0310 16:27:40.758985 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5869f21f-5fc7-4837-b7e1-688cc16dc3ef/proxy-httpd/0.log" Mar 10 16:27:40 crc kubenswrapper[4743]: I0310 16:27:40.814047 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5869f21f-5fc7-4837-b7e1-688cc16dc3ef/sg-core/0.log" Mar 10 16:27:41 crc kubenswrapper[4743]: I0310 16:27:41.083461 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_203a508a-4041-40ea-9a1b-bb6a706d9339/ceph/0.log" Mar 10 16:27:41 crc kubenswrapper[4743]: I0310 16:27:41.266001 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c5211b51-a212-41a2-9c1d-62e0029300e2/cinder-api/0.log" Mar 10 16:27:41 crc 
kubenswrapper[4743]: I0310 16:27:41.429791 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c5211b51-a212-41a2-9c1d-62e0029300e2/cinder-api-log/0.log" Mar 10 16:27:41 crc kubenswrapper[4743]: I0310 16:27:41.579868 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_42310a41-71eb-4eb8-bba6-938d1307270c/probe/0.log" Mar 10 16:27:41 crc kubenswrapper[4743]: I0310 16:27:41.684464 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_40933ebc-541e-4ae6-8280-372d05c43c3c/cinder-scheduler/0.log" Mar 10 16:27:41 crc kubenswrapper[4743]: I0310 16:27:41.704315 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_42310a41-71eb-4eb8-bba6-938d1307270c/cinder-backup/0.log" Mar 10 16:27:41 crc kubenswrapper[4743]: I0310 16:27:41.897922 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_40933ebc-541e-4ae6-8280-372d05c43c3c/probe/0.log" Mar 10 16:27:41 crc kubenswrapper[4743]: I0310 16:27:41.981053 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_df8b86ca-f690-4581-a74f-ab245f3b2479/probe/0.log" Mar 10 16:27:42 crc kubenswrapper[4743]: I0310 16:27:42.153829 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fw8dw_c1c83696-4c97-45ea-be7a-f635b349da0b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:42 crc kubenswrapper[4743]: I0310 16:27:42.410194 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-l22ct_273ce723-179f-468a-b890-3336be7763a0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:42 crc kubenswrapper[4743]: I0310 16:27:42.587524 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-qjn6h_122f314d-cff5-4699-9d4c-c5221b9174ba/init/0.log" Mar 10 16:27:42 crc kubenswrapper[4743]: I0310 16:27:42.876573 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-qjn6h_122f314d-cff5-4699-9d4c-c5221b9174ba/init/0.log" Mar 10 16:27:42 crc kubenswrapper[4743]: I0310 16:27:42.970643 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-qjn6h_122f314d-cff5-4699-9d4c-c5221b9174ba/dnsmasq-dns/0.log" Mar 10 16:27:43 crc kubenswrapper[4743]: I0310 16:27:43.129681 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6k57c_a14f160d-bf98-45da-a719-d8937e9281b0/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:43 crc kubenswrapper[4743]: I0310 16:27:43.208261 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2f464dd-fd03-486c-afac-a4e86a4e5226/glance-httpd/0.log" Mar 10 16:27:43 crc kubenswrapper[4743]: I0310 16:27:43.253774 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2f464dd-fd03-486c-afac-a4e86a4e5226/glance-log/0.log" Mar 10 16:27:43 crc kubenswrapper[4743]: I0310 16:27:43.498715 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ae118361-9b75-4d40-8145-fcefb244db30/glance-httpd/0.log" Mar 10 16:27:43 crc kubenswrapper[4743]: I0310 16:27:43.601295 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ae118361-9b75-4d40-8145-fcefb244db30/glance-log/0.log" Mar 10 16:27:43 crc kubenswrapper[4743]: I0310 16:27:43.888161 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-958fd895b-mxn2t_cccf05c8-d4e8-4a1d-912f-5f4a37440ac7/horizon/1.log" Mar 10 16:27:44 crc kubenswrapper[4743]: I0310 
16:27:44.011698 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-958fd895b-mxn2t_cccf05c8-d4e8-4a1d-912f-5f4a37440ac7/horizon/0.log" Mar 10 16:27:44 crc kubenswrapper[4743]: I0310 16:27:44.115382 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-dxwlk_2a1df74b-db86-4291-9bbc-202200fb7f7b/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:44 crc kubenswrapper[4743]: I0310 16:27:44.302378 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qnj4p_5ac39670-4aeb-410f-a275-9b011cb8a21c/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:44 crc kubenswrapper[4743]: I0310 16:27:44.572763 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-958fd895b-mxn2t_cccf05c8-d4e8-4a1d-912f-5f4a37440ac7/horizon-log/0.log" Mar 10 16:27:44 crc kubenswrapper[4743]: I0310 16:27:44.684942 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29552641-qjhxn_87cf92f8-0e04-4abd-b668-40e17f635753/keystone-cron/0.log" Mar 10 16:27:44 crc kubenswrapper[4743]: I0310 16:27:44.827576 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_820c4321-0bbe-4413-bf70-80da56a68366/kube-state-metrics/0.log" Mar 10 16:27:44 crc kubenswrapper[4743]: I0310 16:27:44.867486 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_df8b86ca-f690-4581-a74f-ab245f3b2479/cinder-volume/0.log" Mar 10 16:27:45 crc kubenswrapper[4743]: I0310 16:27:45.144316 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fd24m_b982c5ef-116d-4e18-a707-768c7f0fbfc0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:45 crc kubenswrapper[4743]: I0310 16:27:45.600049 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-api-0_aeb95b3f-66df-4abf-99e5-b18c24053075/manila-api/0.log" Mar 10 16:27:45 crc kubenswrapper[4743]: I0310 16:27:45.676964 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_807051a4-de7a-46e0-a230-f3e843c9ab76/manila-scheduler/0.log" Mar 10 16:27:45 crc kubenswrapper[4743]: I0310 16:27:45.689349 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_807051a4-de7a-46e0-a230-f3e843c9ab76/probe/0.log" Mar 10 16:27:45 crc kubenswrapper[4743]: I0310 16:27:45.946142 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_f94c3f98-f911-491e-bbae-5e5c8b3d0c10/probe/0.log" Mar 10 16:27:46 crc kubenswrapper[4743]: I0310 16:27:46.246298 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aeb95b3f-66df-4abf-99e5-b18c24053075/manila-api-log/0.log" Mar 10 16:27:46 crc kubenswrapper[4743]: I0310 16:27:46.285198 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_f94c3f98-f911-491e-bbae-5e5c8b3d0c10/manila-share/0.log" Mar 10 16:27:46 crc kubenswrapper[4743]: I0310 16:27:46.629967 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_532a024a-c011-4684-bff0-7d91932d8895/memcached/0.log" Mar 10 16:27:47 crc kubenswrapper[4743]: I0310 16:27:47.660051 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-w55p6_094b793a-799c-4f28-adb3-7caa9a6e732a/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:48 crc kubenswrapper[4743]: I0310 16:27:48.270222 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f8646897-kmnvw_7513fc87-13e7-4273-98eb-fda8dd8d0305/neutron-httpd/0.log" Mar 10 16:27:48 crc kubenswrapper[4743]: I0310 16:27:48.513530 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-5bbb89db44-8df8j_c440f8ea-e5b3-49ea-a981-cf68bac5a2e5/keystone-api/0.log" Mar 10 16:27:49 crc kubenswrapper[4743]: I0310 16:27:49.094154 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f8646897-kmnvw_7513fc87-13e7-4273-98eb-fda8dd8d0305/neutron-api/0.log" Mar 10 16:27:49 crc kubenswrapper[4743]: I0310 16:27:49.153323 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fbcc9336-67de-48d7-8af4-bfb56233b69c/nova-cell0-conductor-conductor/0.log" Mar 10 16:27:49 crc kubenswrapper[4743]: I0310 16:27:49.329360 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_05ebbfdb-6c63-43a0-bd96-7e0adcb97221/nova-cell1-conductor-conductor/0.log" Mar 10 16:27:49 crc kubenswrapper[4743]: I0310 16:27:49.514991 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fe5569d1-9407-4a10-bfad-6a9f8f2b6e3d/nova-cell1-novncproxy-novncproxy/0.log" Mar 10 16:27:49 crc kubenswrapper[4743]: I0310 16:27:49.622016 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-bmhcb_c93be7aa-1386-4f1b-9d49-eeb48c2e982c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:49 crc kubenswrapper[4743]: I0310 16:27:49.872271 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_608e6dfb-b1d2-4f61-93c1-28aa07052a5f/nova-metadata-log/0.log" Mar 10 16:27:50 crc kubenswrapper[4743]: I0310 16:27:50.311180 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_476447c9-b5d8-4f7d-a6a7-f1bff8302ced/nova-api-log/0.log" Mar 10 16:27:50 crc kubenswrapper[4743]: I0310 16:27:50.471859 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7659278c-be80-4660-a4df-a04d4a3bc888/mysql-bootstrap/0.log" Mar 10 16:27:50 crc kubenswrapper[4743]: I0310 
16:27:50.541433 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5d521c59-429a-4612-b6f4-fbc32204a748/nova-scheduler-scheduler/0.log" Mar 10 16:27:50 crc kubenswrapper[4743]: I0310 16:27:50.619148 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7659278c-be80-4660-a4df-a04d4a3bc888/mysql-bootstrap/0.log" Mar 10 16:27:50 crc kubenswrapper[4743]: I0310 16:27:50.722915 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7659278c-be80-4660-a4df-a04d4a3bc888/galera/0.log" Mar 10 16:27:50 crc kubenswrapper[4743]: I0310 16:27:50.842342 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f7917171-7630-46e9-9ada-f7072a5fd530/mysql-bootstrap/0.log" Mar 10 16:27:50 crc kubenswrapper[4743]: I0310 16:27:50.943249 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_476447c9-b5d8-4f7d-a6a7-f1bff8302ced/nova-api-api/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.028507 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f7917171-7630-46e9-9ada-f7072a5fd530/mysql-bootstrap/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.143186 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f7917171-7630-46e9-9ada-f7072a5fd530/galera/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.220617 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_da9eebbd-7a82-441b-8ca6-14657357a1f0/openstackclient/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.347632 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-djcxl_f838c354-e379-48b7-8c2e-e295ec8c135b/openstack-network-exporter/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.432457 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-metadata-0_608e6dfb-b1d2-4f61-93c1-28aa07052a5f/nova-metadata-metadata/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.436144 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2xslw_bcb12ceb-d1b1-4c62-a718-602c0070e84c/ovsdb-server-init/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.618731 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2xslw_bcb12ceb-d1b1-4c62-a718-602c0070e84c/ovsdb-server-init/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.659502 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2xslw_bcb12ceb-d1b1-4c62-a718-602c0070e84c/ovsdb-server/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.663048 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2xslw_bcb12ceb-d1b1-4c62-a718-602c0070e84c/ovs-vswitchd/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.691917 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-x7xkr_b4e9ac77-eb37-4b1b-a209-8d4e8ce46c64/ovn-controller/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.891597 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-pmgm5_b481f000-abd5-4ee6-8b39-301e66c22f2a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.928661 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3431d016-101b-4513-8027-33805ae14fce/openstack-network-exporter/0.log" Mar 10 16:27:51 crc kubenswrapper[4743]: I0310 16:27:51.967952 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3431d016-101b-4513-8027-33805ae14fce/ovn-northd/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.074466 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d73b4e03-5996-41fd-8450-63b8de9e9f2e/openstack-network-exporter/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.130937 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d73b4e03-5996-41fd-8450-63b8de9e9f2e/ovsdbserver-nb/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.168970 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_688394f8-79dd-4922-9a2c-fbd9ffbb3a28/openstack-network-exporter/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.246134 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_688394f8-79dd-4922-9a2c-fbd9ffbb3a28/ovsdbserver-sb/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.499716 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b741008c-73ba-4516-bb63-05b066d7051b/setup-container/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.656994 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64b84f4b48-6qhqj_99416e79-6415-4c9d-93b0-920307f57e4c/placement-api/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.742468 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64b84f4b48-6qhqj_99416e79-6415-4c9d-93b0-920307f57e4c/placement-log/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.743535 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b741008c-73ba-4516-bb63-05b066d7051b/setup-container/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.759322 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b741008c-73ba-4516-bb63-05b066d7051b/rabbitmq/0.log" Mar 10 16:27:52 crc kubenswrapper[4743]: I0310 16:27:52.872820 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e44585ef-8ab2-45e9-a4f3-f333629f433a/setup-container/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.060150 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pqq9n_85abb06e-1162-44ac-93bf-9db7fb08a980/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.069651 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e44585ef-8ab2-45e9-a4f3-f333629f433a/rabbitmq/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.120634 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e44585ef-8ab2-45e9-a4f3-f333629f433a/setup-container/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.298560 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7kr4h_1c7d6ed2-094e-4726-b1b3-6238cb505f5e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.324617 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zdbhq_dc9f4770-25d4-4119-b914-9cffc9049566/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.370509 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bgq9p_ca697d29-a195-499a-8e64-1688d6748d0c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.543130 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-twkdg_8c7fcae9-f7ce-458c-ba25-87d2e932de62/ssh-known-hosts-edpm-deployment/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.647014 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-7457b8c8b7-rzp6b_5d2c4fc9-7b3d-457e-af7d-52e1cda83b53/proxy-server/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.761338 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7457b8c8b7-rzp6b_5d2c4fc9-7b3d-457e-af7d-52e1cda83b53/proxy-httpd/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.765720 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-lbfhc_1f2a6755-0e08-482b-9815-88840f35fb4e/swift-ring-rebalance/0.log" Mar 10 16:27:53 crc kubenswrapper[4743]: I0310 16:27:53.882891 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/account-auditor/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.024448 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/account-reaper/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.045061 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/account-server/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.046452 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/account-replicator/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.141507 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/container-auditor/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.175206 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/container-replicator/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.242694 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/container-server/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.268227 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/container-updater/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.291720 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/object-auditor/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.351041 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/object-expirer/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.434822 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/object-updater/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.435344 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/object-replicator/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.502443 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/object-server/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.502905 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/rsync/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.577924 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05770cd2-4275-4fcc-bd98-f8951c4d91ba/swift-recon-cron/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.701761 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-swbc5_afce2ed9-7b72-4bee-a5f1-689f9f6888d8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.798742 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_fa680413-f368-421d-914c-1941e02c2c57/tempest-tests-tempest-tests-runner/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.907035 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e2bbadc2-6a1f-4f73-9bfb-d7380fd2c862/test-operator-logs-container/0.log" Mar 10 16:27:54 crc kubenswrapper[4743]: I0310 16:27:54.988941 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-p6wtf_7d2cd554-fedb-43fb-aefd-0f82a6c265e4/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:27:57 crc kubenswrapper[4743]: I0310 16:27:57.860819 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-65gjt"] Mar 10 16:27:57 crc kubenswrapper[4743]: E0310 16:27:57.861520 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31fa8da-9fdb-4145-a2ef-aff736dee403" containerName="container-00" Mar 10 16:27:57 crc kubenswrapper[4743]: I0310 16:27:57.861532 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31fa8da-9fdb-4145-a2ef-aff736dee403" containerName="container-00" Mar 10 16:27:57 crc kubenswrapper[4743]: I0310 16:27:57.861722 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31fa8da-9fdb-4145-a2ef-aff736dee403" containerName="container-00" Mar 10 16:27:57 crc kubenswrapper[4743]: I0310 16:27:57.863483 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:57 crc kubenswrapper[4743]: I0310 16:27:57.880561 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65gjt"] Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.001044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-utilities\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.001268 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-catalog-content\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.001309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw444\" (UniqueName: \"kubernetes.io/projected/34f73ddb-d3fb-436f-b81c-48a75234a43b-kube-api-access-gw444\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.103636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-catalog-content\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.103685 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gw444\" (UniqueName: \"kubernetes.io/projected/34f73ddb-d3fb-436f-b81c-48a75234a43b-kube-api-access-gw444\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.103766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-utilities\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.104203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-catalog-content\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.104285 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-utilities\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.123859 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw444\" (UniqueName: \"kubernetes.io/projected/34f73ddb-d3fb-436f-b81c-48a75234a43b-kube-api-access-gw444\") pod \"redhat-marketplace-65gjt\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.201701 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:27:58 crc kubenswrapper[4743]: I0310 16:27:58.781573 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65gjt"] Mar 10 16:27:59 crc kubenswrapper[4743]: I0310 16:27:59.177791 4743 generic.go:334] "Generic (PLEG): container finished" podID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerID="a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904" exitCode=0 Mar 10 16:27:59 crc kubenswrapper[4743]: I0310 16:27:59.177859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65gjt" event={"ID":"34f73ddb-d3fb-436f-b81c-48a75234a43b","Type":"ContainerDied","Data":"a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904"} Mar 10 16:27:59 crc kubenswrapper[4743]: I0310 16:27:59.178258 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65gjt" event={"ID":"34f73ddb-d3fb-436f-b81c-48a75234a43b","Type":"ContainerStarted","Data":"a293c08af093e48032c51e99470c25d089ee88feff7a95337b768aff00c69805"} Mar 10 16:27:59 crc kubenswrapper[4743]: I0310 16:27:59.181028 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.157595 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552668-n92hv"] Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.159137 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552668-n92hv" Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.161294 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.162666 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.167422 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.197913 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552668-n92hv"] Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.247993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpbd6\" (UniqueName: \"kubernetes.io/projected/403b250a-af4e-4306-9544-a821adfaf9d8-kube-api-access-xpbd6\") pod \"auto-csr-approver-29552668-n92hv\" (UID: \"403b250a-af4e-4306-9544-a821adfaf9d8\") " pod="openshift-infra/auto-csr-approver-29552668-n92hv" Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.350496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpbd6\" (UniqueName: \"kubernetes.io/projected/403b250a-af4e-4306-9544-a821adfaf9d8-kube-api-access-xpbd6\") pod \"auto-csr-approver-29552668-n92hv\" (UID: \"403b250a-af4e-4306-9544-a821adfaf9d8\") " pod="openshift-infra/auto-csr-approver-29552668-n92hv" Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.374858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpbd6\" (UniqueName: \"kubernetes.io/projected/403b250a-af4e-4306-9544-a821adfaf9d8-kube-api-access-xpbd6\") pod \"auto-csr-approver-29552668-n92hv\" (UID: \"403b250a-af4e-4306-9544-a821adfaf9d8\") " 
pod="openshift-infra/auto-csr-approver-29552668-n92hv" Mar 10 16:28:00 crc kubenswrapper[4743]: I0310 16:28:00.503021 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552668-n92hv" Mar 10 16:28:01 crc kubenswrapper[4743]: I0310 16:28:01.026747 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552668-n92hv"] Mar 10 16:28:01 crc kubenswrapper[4743]: I0310 16:28:01.204535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552668-n92hv" event={"ID":"403b250a-af4e-4306-9544-a821adfaf9d8","Type":"ContainerStarted","Data":"4096c5b61a804fef7b6261a04a76187c257ca54f599e7f77ae21a4de3c1bcd37"} Mar 10 16:28:01 crc kubenswrapper[4743]: I0310 16:28:01.206218 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65gjt" event={"ID":"34f73ddb-d3fb-436f-b81c-48a75234a43b","Type":"ContainerStarted","Data":"082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078"} Mar 10 16:28:02 crc kubenswrapper[4743]: I0310 16:28:02.216396 4743 generic.go:334] "Generic (PLEG): container finished" podID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerID="082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078" exitCode=0 Mar 10 16:28:02 crc kubenswrapper[4743]: I0310 16:28:02.216478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65gjt" event={"ID":"34f73ddb-d3fb-436f-b81c-48a75234a43b","Type":"ContainerDied","Data":"082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078"} Mar 10 16:28:03 crc kubenswrapper[4743]: I0310 16:28:03.231962 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65gjt" event={"ID":"34f73ddb-d3fb-436f-b81c-48a75234a43b","Type":"ContainerStarted","Data":"9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4"} Mar 10 16:28:03 crc 
kubenswrapper[4743]: I0310 16:28:03.238413 4743 generic.go:334] "Generic (PLEG): container finished" podID="403b250a-af4e-4306-9544-a821adfaf9d8" containerID="27bb71a669b1af44d2518310d2044069aebde082f618afa0b6d839e2580b459f" exitCode=0 Mar 10 16:28:03 crc kubenswrapper[4743]: I0310 16:28:03.238466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552668-n92hv" event={"ID":"403b250a-af4e-4306-9544-a821adfaf9d8","Type":"ContainerDied","Data":"27bb71a669b1af44d2518310d2044069aebde082f618afa0b6d839e2580b459f"} Mar 10 16:28:03 crc kubenswrapper[4743]: I0310 16:28:03.255535 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-65gjt" podStartSLOduration=2.543927805 podStartE2EDuration="6.255510629s" podCreationTimestamp="2026-03-10 16:27:57 +0000 UTC" firstStartedPulling="2026-03-10 16:27:59.180665092 +0000 UTC m=+4943.887479840" lastFinishedPulling="2026-03-10 16:28:02.892247916 +0000 UTC m=+4947.599062664" observedRunningTime="2026-03-10 16:28:03.253414729 +0000 UTC m=+4947.960229477" watchObservedRunningTime="2026-03-10 16:28:03.255510629 +0000 UTC m=+4947.962325377" Mar 10 16:28:04 crc kubenswrapper[4743]: I0310 16:28:04.664942 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552668-n92hv" Mar 10 16:28:04 crc kubenswrapper[4743]: I0310 16:28:04.788198 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpbd6\" (UniqueName: \"kubernetes.io/projected/403b250a-af4e-4306-9544-a821adfaf9d8-kube-api-access-xpbd6\") pod \"403b250a-af4e-4306-9544-a821adfaf9d8\" (UID: \"403b250a-af4e-4306-9544-a821adfaf9d8\") " Mar 10 16:28:04 crc kubenswrapper[4743]: I0310 16:28:04.806117 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403b250a-af4e-4306-9544-a821adfaf9d8-kube-api-access-xpbd6" (OuterVolumeSpecName: "kube-api-access-xpbd6") pod "403b250a-af4e-4306-9544-a821adfaf9d8" (UID: "403b250a-af4e-4306-9544-a821adfaf9d8"). InnerVolumeSpecName "kube-api-access-xpbd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:28:04 crc kubenswrapper[4743]: I0310 16:28:04.890480 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpbd6\" (UniqueName: \"kubernetes.io/projected/403b250a-af4e-4306-9544-a821adfaf9d8-kube-api-access-xpbd6\") on node \"crc\" DevicePath \"\"" Mar 10 16:28:05 crc kubenswrapper[4743]: I0310 16:28:05.257121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552668-n92hv" event={"ID":"403b250a-af4e-4306-9544-a821adfaf9d8","Type":"ContainerDied","Data":"4096c5b61a804fef7b6261a04a76187c257ca54f599e7f77ae21a4de3c1bcd37"} Mar 10 16:28:05 crc kubenswrapper[4743]: I0310 16:28:05.257174 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4096c5b61a804fef7b6261a04a76187c257ca54f599e7f77ae21a4de3c1bcd37" Mar 10 16:28:05 crc kubenswrapper[4743]: I0310 16:28:05.257240 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552668-n92hv" Mar 10 16:28:05 crc kubenswrapper[4743]: I0310 16:28:05.745665 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552662-clljw"] Mar 10 16:28:05 crc kubenswrapper[4743]: I0310 16:28:05.756958 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552662-clljw"] Mar 10 16:28:05 crc kubenswrapper[4743]: I0310 16:28:05.926714 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aebd1f52-4b59-4d6e-b603-b6615e63a7fa" path="/var/lib/kubelet/pods/aebd1f52-4b59-4d6e-b603-b6615e63a7fa/volumes" Mar 10 16:28:08 crc kubenswrapper[4743]: I0310 16:28:08.203062 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:28:08 crc kubenswrapper[4743]: I0310 16:28:08.203431 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:28:08 crc kubenswrapper[4743]: I0310 16:28:08.261317 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:28:08 crc kubenswrapper[4743]: I0310 16:28:08.360974 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:28:08 crc kubenswrapper[4743]: I0310 16:28:08.494751 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65gjt"] Mar 10 16:28:10 crc kubenswrapper[4743]: I0310 16:28:10.320072 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-65gjt" podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerName="registry-server" containerID="cri-o://9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4" gracePeriod=2 Mar 10 16:28:10 crc 
kubenswrapper[4743]: I0310 16:28:10.792030 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:28:10 crc kubenswrapper[4743]: I0310 16:28:10.920216 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-catalog-content\") pod \"34f73ddb-d3fb-436f-b81c-48a75234a43b\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " Mar 10 16:28:10 crc kubenswrapper[4743]: I0310 16:28:10.921122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw444\" (UniqueName: \"kubernetes.io/projected/34f73ddb-d3fb-436f-b81c-48a75234a43b-kube-api-access-gw444\") pod \"34f73ddb-d3fb-436f-b81c-48a75234a43b\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " Mar 10 16:28:10 crc kubenswrapper[4743]: I0310 16:28:10.921172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-utilities\") pod \"34f73ddb-d3fb-436f-b81c-48a75234a43b\" (UID: \"34f73ddb-d3fb-436f-b81c-48a75234a43b\") " Mar 10 16:28:10 crc kubenswrapper[4743]: I0310 16:28:10.921946 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-utilities" (OuterVolumeSpecName: "utilities") pod "34f73ddb-d3fb-436f-b81c-48a75234a43b" (UID: "34f73ddb-d3fb-436f-b81c-48a75234a43b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:28:10 crc kubenswrapper[4743]: I0310 16:28:10.942869 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f73ddb-d3fb-436f-b81c-48a75234a43b-kube-api-access-gw444" (OuterVolumeSpecName: "kube-api-access-gw444") pod "34f73ddb-d3fb-436f-b81c-48a75234a43b" (UID: "34f73ddb-d3fb-436f-b81c-48a75234a43b"). InnerVolumeSpecName "kube-api-access-gw444". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.023832 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw444\" (UniqueName: \"kubernetes.io/projected/34f73ddb-d3fb-436f-b81c-48a75234a43b-kube-api-access-gw444\") on node \"crc\" DevicePath \"\"" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.023866 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.156760 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34f73ddb-d3fb-436f-b81c-48a75234a43b" (UID: "34f73ddb-d3fb-436f-b81c-48a75234a43b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.227966 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f73ddb-d3fb-436f-b81c-48a75234a43b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.332524 4743 generic.go:334] "Generic (PLEG): container finished" podID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerID="9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4" exitCode=0 Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.332564 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65gjt" event={"ID":"34f73ddb-d3fb-436f-b81c-48a75234a43b","Type":"ContainerDied","Data":"9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4"} Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.332584 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65gjt" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.332606 4743 scope.go:117] "RemoveContainer" containerID="9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.332593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65gjt" event={"ID":"34f73ddb-d3fb-436f-b81c-48a75234a43b","Type":"ContainerDied","Data":"a293c08af093e48032c51e99470c25d089ee88feff7a95337b768aff00c69805"} Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.357776 4743 scope.go:117] "RemoveContainer" containerID="082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.371272 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65gjt"] Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.381831 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-65gjt"] Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.397992 4743 scope.go:117] "RemoveContainer" containerID="a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.459719 4743 scope.go:117] "RemoveContainer" containerID="9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4" Mar 10 16:28:11 crc kubenswrapper[4743]: E0310 16:28:11.460542 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4\": container with ID starting with 9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4 not found: ID does not exist" containerID="9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.460607 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4"} err="failed to get container status \"9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4\": rpc error: code = NotFound desc = could not find container \"9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4\": container with ID starting with 9914ddbe8e9271eb83329afe0283db3d8712c1735c48907d25968583b2812de4 not found: ID does not exist" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.460649 4743 scope.go:117] "RemoveContainer" containerID="082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078" Mar 10 16:28:11 crc kubenswrapper[4743]: E0310 16:28:11.462423 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078\": container with ID starting with 082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078 not found: ID does not exist" containerID="082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.462476 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078"} err="failed to get container status \"082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078\": rpc error: code = NotFound desc = could not find container \"082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078\": container with ID starting with 082346599b4e339b7ba71d92a17bfe91bcd35f61e821c4ee864331f6078b8078 not found: ID does not exist" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.462510 4743 scope.go:117] "RemoveContainer" containerID="a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904" Mar 10 16:28:11 crc kubenswrapper[4743]: E0310 
16:28:11.462890 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904\": container with ID starting with a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904 not found: ID does not exist" containerID="a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.462928 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904"} err="failed to get container status \"a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904\": rpc error: code = NotFound desc = could not find container \"a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904\": container with ID starting with a0445fcc62349d532dcb54675f68896c766d16291ebefc834170b1e82819a904 not found: ID does not exist" Mar 10 16:28:11 crc kubenswrapper[4743]: I0310 16:28:11.946860 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" path="/var/lib/kubelet/pods/34f73ddb-d3fb-436f-b81c-48a75234a43b/volumes" Mar 10 16:28:18 crc kubenswrapper[4743]: I0310 16:28:18.015180 4743 scope.go:117] "RemoveContainer" containerID="5c210c8062114e4335ff0d1abfb60d71a7ff31fe124c3e911038f78b7dc44a94" Mar 10 16:28:20 crc kubenswrapper[4743]: I0310 16:28:20.731846 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-gsvwz_5525e521-3469-4b99-8fc0-0894d01bfbb1/manager/0.log" Mar 10 16:28:20 crc kubenswrapper[4743]: I0310 16:28:20.969989 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz_412e2285-16ae-49ca-a514-ee3f1297bfd4/util/0.log" Mar 10 16:28:21 crc kubenswrapper[4743]: 
I0310 16:28:21.668182 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz_412e2285-16ae-49ca-a514-ee3f1297bfd4/util/0.log" Mar 10 16:28:21 crc kubenswrapper[4743]: I0310 16:28:21.721602 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz_412e2285-16ae-49ca-a514-ee3f1297bfd4/pull/0.log" Mar 10 16:28:21 crc kubenswrapper[4743]: I0310 16:28:21.918797 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz_412e2285-16ae-49ca-a514-ee3f1297bfd4/pull/0.log" Mar 10 16:28:22 crc kubenswrapper[4743]: I0310 16:28:22.096388 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz_412e2285-16ae-49ca-a514-ee3f1297bfd4/util/0.log" Mar 10 16:28:22 crc kubenswrapper[4743]: I0310 16:28:22.150145 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz_412e2285-16ae-49ca-a514-ee3f1297bfd4/pull/0.log" Mar 10 16:28:22 crc kubenswrapper[4743]: I0310 16:28:22.289650 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f244mkz_412e2285-16ae-49ca-a514-ee3f1297bfd4/extract/0.log" Mar 10 16:28:22 crc kubenswrapper[4743]: I0310 16:28:22.621729 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-bdbfx_4b4416c1-939f-4740-bd01-f45d9cfd8822/manager/0.log" Mar 10 16:28:22 crc kubenswrapper[4743]: I0310 16:28:22.887181 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-lcr6h_8485c807-58f7-45c6-845f-1bc881558553/manager/0.log" Mar 10 16:28:22 crc kubenswrapper[4743]: I0310 16:28:22.945336 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-97gbz_33b59c71-8ce1-468d-b617-50bc021d41b2/manager/0.log" Mar 10 16:28:23 crc kubenswrapper[4743]: I0310 16:28:23.242029 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-tsd2m_bf42ffd4-e446-4dee-b343-b8d64dcb8e2c/manager/0.log" Mar 10 16:28:23 crc kubenswrapper[4743]: I0310 16:28:23.541722 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-4vj77_364fba36-380b-47fb-9f0d-38585fb94bac/manager/0.log" Mar 10 16:28:23 crc kubenswrapper[4743]: I0310 16:28:23.578526 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-cvxq6_45ac07a3-b1f8-4232-937d-6a7275aac026/manager/0.log" Mar 10 16:28:23 crc kubenswrapper[4743]: I0310 16:28:23.885508 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-r4gcw_a23de576-a42f-4a68-8e30-5a71791b89e0/manager/0.log" Mar 10 16:28:24 crc kubenswrapper[4743]: I0310 16:28:24.069702 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-vcztk_13525a0b-792c-4304-9b21-4ff84f391e20/manager/0.log" Mar 10 16:28:24 crc kubenswrapper[4743]: I0310 16:28:24.214825 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-s8gf8_e17c0b05-44be-478d-8674-fe42a01f9397/manager/0.log" Mar 10 16:28:24 crc kubenswrapper[4743]: I0310 16:28:24.424958 4743 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-v5fw5_38bf9e64-a759-40d7-a3f9-f443299160ec/manager/0.log" Mar 10 16:28:24 crc kubenswrapper[4743]: I0310 16:28:24.615088 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-kn6vc_a70223a5-4e89-4b37-a031-6b93350120c2/manager/0.log" Mar 10 16:28:24 crc kubenswrapper[4743]: I0310 16:28:24.650550 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-vqpbs_7b3fecc3-bc35-40a2-b927-65cda4fbe04d/manager/0.log" Mar 10 16:28:24 crc kubenswrapper[4743]: I0310 16:28:24.795249 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7gvp5w_a9eaff72-01ad-4abf-a515-92bdda950b0f/manager/0.log" Mar 10 16:28:25 crc kubenswrapper[4743]: I0310 16:28:25.130876 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7c7f7d994-spvkt_4cd2825d-785d-46c7-8d95-0237100bd129/operator/0.log" Mar 10 16:28:25 crc kubenswrapper[4743]: I0310 16:28:25.228990 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-pn4jm_bdb5f935-4758-4d10-8e11-1e884efabce6/registry-server/0.log" Mar 10 16:28:25 crc kubenswrapper[4743]: I0310 16:28:25.766956 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-wbbgz_fa1fe66b-2fd5-40fe-8062-9fd96edd519a/manager/0.log" Mar 10 16:28:25 crc kubenswrapper[4743]: I0310 16:28:25.837754 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-s7fnn_e6d20795-5386-4ef1-929b-21659af4ecc6/manager/0.log" Mar 10 16:28:26 crc kubenswrapper[4743]: I0310 16:28:26.042877 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pmnxq_94a79fc9-d43a-4a8d-9d33-1bcfd7306537/operator/0.log" Mar 10 16:28:26 crc kubenswrapper[4743]: I0310 16:28:26.085162 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-jfbcg_7a579dd7-32af-4705-913a-12ec54a71572/manager/0.log" Mar 10 16:28:26 crc kubenswrapper[4743]: I0310 16:28:26.354456 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-hfrrc_bd198f14-3a65-4742-bf29-7938e52d3284/manager/0.log" Mar 10 16:28:26 crc kubenswrapper[4743]: I0310 16:28:26.508920 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-kmmz6_65c955aa-82e1-4acc-9c36-7029482fcac3/manager/0.log" Mar 10 16:28:26 crc kubenswrapper[4743]: I0310 16:28:26.647671 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-4652x_6d72554f-3af0-4d26-91f6-f0375b131c31/manager/0.log" Mar 10 16:28:27 crc kubenswrapper[4743]: I0310 16:28:27.009019 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76d6f6bb5f-8bnxf_9f66984b-7e2a-4644-892d-96c8a0268ab6/manager/0.log" Mar 10 16:28:31 crc kubenswrapper[4743]: I0310 16:28:31.792266 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-8ppkz_b39bb6a2-9c5a-4acc-bbae-fe5d63a971c8/manager/0.log" Mar 10 16:28:48 crc kubenswrapper[4743]: I0310 16:28:48.885412 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nr4wq_afc92a5c-a4ef-4ae8-9425-9787ea43ca0a/control-plane-machine-set-operator/0.log" Mar 10 16:28:49 crc kubenswrapper[4743]: 
I0310 16:28:49.084520 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mlm2p_d02c0611-45a7-4760-8187-4fd2b39f7dd4/kube-rbac-proxy/0.log" Mar 10 16:28:49 crc kubenswrapper[4743]: I0310 16:28:49.100582 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mlm2p_d02c0611-45a7-4760-8187-4fd2b39f7dd4/machine-api-operator/0.log" Mar 10 16:29:04 crc kubenswrapper[4743]: I0310 16:29:04.695494 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-8png4_5a8813f8-fe38-4cb2-a737-ac9e4abce6a9/cert-manager-controller/0.log" Mar 10 16:29:04 crc kubenswrapper[4743]: I0310 16:29:04.756807 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-2p4r2_7b482c75-7f98-46ad-8ad5-ff3df46f8965/cert-manager-cainjector/0.log" Mar 10 16:29:04 crc kubenswrapper[4743]: I0310 16:29:04.940325 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-65ln8_5d7842a5-2d26-49cc-b4bf-c5afec234f08/cert-manager-webhook/0.log" Mar 10 16:29:11 crc kubenswrapper[4743]: I0310 16:29:11.252632 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:29:11 crc kubenswrapper[4743]: I0310 16:29:11.253200 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:29:17 crc kubenswrapper[4743]: I0310 16:29:17.860132 
4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-74xhc_12b52c8d-3a76-46a0-a2fe-6279e2537c9f/nmstate-console-plugin/0.log" Mar 10 16:29:18 crc kubenswrapper[4743]: I0310 16:29:18.056336 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xgtpr_a99b4c91-7907-4849-81f1-47627fe794fa/nmstate-handler/0.log" Mar 10 16:29:18 crc kubenswrapper[4743]: I0310 16:29:18.159390 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-mzqq9_3532faa1-be5a-4242-8596-aa02e6263960/kube-rbac-proxy/0.log" Mar 10 16:29:18 crc kubenswrapper[4743]: I0310 16:29:18.207438 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-mzqq9_3532faa1-be5a-4242-8596-aa02e6263960/nmstate-metrics/0.log" Mar 10 16:29:18 crc kubenswrapper[4743]: I0310 16:29:18.308851 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-66r4d_c663bb83-e937-4db5-94e3-87f5253b27c9/nmstate-operator/0.log" Mar 10 16:29:18 crc kubenswrapper[4743]: I0310 16:29:18.450147 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-926jh_59af88c0-1e3a-4bf3-8ad3-4d11a0248c70/nmstate-webhook/0.log" Mar 10 16:29:41 crc kubenswrapper[4743]: I0310 16:29:41.252589 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:29:41 crc kubenswrapper[4743]: I0310 16:29:41.253196 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:29:49 crc kubenswrapper[4743]: I0310 16:29:49.569290 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-clgfh_ce84f3bd-aef2-4e6e-8d42-829c75efc758/controller/0.log" Mar 10 16:29:49 crc kubenswrapper[4743]: I0310 16:29:49.653497 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-clgfh_ce84f3bd-aef2-4e6e-8d42-829c75efc758/kube-rbac-proxy/0.log" Mar 10 16:29:49 crc kubenswrapper[4743]: I0310 16:29:49.928774 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-frr-files/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.037899 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-frr-files/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.117649 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-reloader/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.126397 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-reloader/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.138157 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-metrics/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.315877 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-reloader/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.321648 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-metrics/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.330382 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-frr-files/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.366462 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-metrics/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.579053 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-reloader/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.596291 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-frr-files/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.629589 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/cp-metrics/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.667644 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/controller/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.884661 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/frr-metrics/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.897969 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/kube-rbac-proxy/0.log" Mar 10 16:29:50 crc kubenswrapper[4743]: I0310 16:29:50.966635 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/kube-rbac-proxy-frr/0.log" Mar 10 16:29:51 crc kubenswrapper[4743]: I0310 16:29:51.160892 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/reloader/0.log" Mar 10 16:29:51 crc kubenswrapper[4743]: I0310 16:29:51.219854 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-c6tlb_596c245f-700e-4539-a0f9-a9c30906383a/frr-k8s-webhook-server/0.log" Mar 10 16:29:51 crc kubenswrapper[4743]: I0310 16:29:51.471355 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85fcd4c69f-jxzhg_7df72fd9-f705-42a8-b630-88cdf35f8874/manager/0.log" Mar 10 16:29:51 crc kubenswrapper[4743]: I0310 16:29:51.695781 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fnzjw_0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1/kube-rbac-proxy/0.log" Mar 10 16:29:51 crc kubenswrapper[4743]: I0310 16:29:51.707731 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66b54d9848-tkr95_43bd5d43-af77-493a-a93e-0110cfd5c307/webhook-server/0.log" Mar 10 16:29:52 crc kubenswrapper[4743]: I0310 16:29:52.427212 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fnzjw_0fa0eb2e-e4a0-47a2-b91d-113b9c3d68d1/speaker/0.log" Mar 10 16:29:52 crc kubenswrapper[4743]: I0310 16:29:52.845284 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7lm5r_ec1cebe8-2ebe-4319-8df4-b6616411c83a/frr/0.log" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.163147 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552670-wpqm9"] Mar 10 16:30:00 crc kubenswrapper[4743]: E0310 16:30:00.163947 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerName="extract-utilities" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.163961 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerName="extract-utilities" Mar 10 16:30:00 crc kubenswrapper[4743]: E0310 16:30:00.163982 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403b250a-af4e-4306-9544-a821adfaf9d8" containerName="oc" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.163992 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="403b250a-af4e-4306-9544-a821adfaf9d8" containerName="oc" Mar 10 16:30:00 crc kubenswrapper[4743]: E0310 16:30:00.164002 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerName="extract-content" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.164008 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerName="extract-content" Mar 10 16:30:00 crc kubenswrapper[4743]: E0310 16:30:00.164021 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerName="registry-server" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.164027 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerName="registry-server" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.164223 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f73ddb-d3fb-436f-b81c-48a75234a43b" containerName="registry-server" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.164238 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="403b250a-af4e-4306-9544-a821adfaf9d8" containerName="oc" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.164920 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552670-wpqm9" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.167919 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.168302 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.168468 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.188292 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552670-wpqm9"] Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.291887 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp"] Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.293207 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.295789 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.296127 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.301607 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp"] Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.354404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2dm\" (UniqueName: \"kubernetes.io/projected/d9233029-7319-4fed-b056-0bbeb8831ac5-kube-api-access-xr2dm\") pod \"auto-csr-approver-29552670-wpqm9\" (UID: \"d9233029-7319-4fed-b056-0bbeb8831ac5\") " pod="openshift-infra/auto-csr-approver-29552670-wpqm9" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.456522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhgt\" (UniqueName: \"kubernetes.io/projected/03b63f95-0d97-49ac-91d3-636a50fcfe43-kube-api-access-4lhgt\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.456793 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03b63f95-0d97-49ac-91d3-636a50fcfe43-secret-volume\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.457102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2dm\" (UniqueName: \"kubernetes.io/projected/d9233029-7319-4fed-b056-0bbeb8831ac5-kube-api-access-xr2dm\") pod \"auto-csr-approver-29552670-wpqm9\" (UID: \"d9233029-7319-4fed-b056-0bbeb8831ac5\") " pod="openshift-infra/auto-csr-approver-29552670-wpqm9" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.457427 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03b63f95-0d97-49ac-91d3-636a50fcfe43-config-volume\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.478731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2dm\" (UniqueName: \"kubernetes.io/projected/d9233029-7319-4fed-b056-0bbeb8831ac5-kube-api-access-xr2dm\") pod \"auto-csr-approver-29552670-wpqm9\" (UID: \"d9233029-7319-4fed-b056-0bbeb8831ac5\") " pod="openshift-infra/auto-csr-approver-29552670-wpqm9" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.495937 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552670-wpqm9" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.558959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03b63f95-0d97-49ac-91d3-636a50fcfe43-secret-volume\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.559113 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03b63f95-0d97-49ac-91d3-636a50fcfe43-config-volume\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.559198 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lhgt\" (UniqueName: \"kubernetes.io/projected/03b63f95-0d97-49ac-91d3-636a50fcfe43-kube-api-access-4lhgt\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.560247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03b63f95-0d97-49ac-91d3-636a50fcfe43-config-volume\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.567705 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/03b63f95-0d97-49ac-91d3-636a50fcfe43-secret-volume\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.579489 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lhgt\" (UniqueName: \"kubernetes.io/projected/03b63f95-0d97-49ac-91d3-636a50fcfe43-kube-api-access-4lhgt\") pod \"collect-profiles-29552670-z67wp\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.611697 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:00 crc kubenswrapper[4743]: I0310 16:30:00.980761 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552670-wpqm9"] Mar 10 16:30:01 crc kubenswrapper[4743]: I0310 16:30:01.127661 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp"] Mar 10 16:30:01 crc kubenswrapper[4743]: W0310 16:30:01.128859 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b63f95_0d97_49ac_91d3_636a50fcfe43.slice/crio-bc55f2edb8a99191cac604d7fe04225c72a4fd63a5d35d7a4869f89f01044f97 WatchSource:0}: Error finding container bc55f2edb8a99191cac604d7fe04225c72a4fd63a5d35d7a4869f89f01044f97: Status 404 returned error can't find the container with id bc55f2edb8a99191cac604d7fe04225c72a4fd63a5d35d7a4869f89f01044f97 Mar 10 16:30:01 crc kubenswrapper[4743]: I0310 16:30:01.390381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" 
event={"ID":"03b63f95-0d97-49ac-91d3-636a50fcfe43","Type":"ContainerStarted","Data":"818ad35accecbc532dd62d54a091e15566b21ccccb2d547586d17047f48acbee"} Mar 10 16:30:01 crc kubenswrapper[4743]: I0310 16:30:01.390458 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" event={"ID":"03b63f95-0d97-49ac-91d3-636a50fcfe43","Type":"ContainerStarted","Data":"bc55f2edb8a99191cac604d7fe04225c72a4fd63a5d35d7a4869f89f01044f97"} Mar 10 16:30:01 crc kubenswrapper[4743]: I0310 16:30:01.392684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552670-wpqm9" event={"ID":"d9233029-7319-4fed-b056-0bbeb8831ac5","Type":"ContainerStarted","Data":"0a3ac8d48b5d3ad4ad70ab64d6bb3cb52170df070d2c95f767f1af394d1faa85"} Mar 10 16:30:01 crc kubenswrapper[4743]: I0310 16:30:01.422403 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" podStartSLOduration=1.422370306 podStartE2EDuration="1.422370306s" podCreationTimestamp="2026-03-10 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:30:01.410460758 +0000 UTC m=+5066.117275506" watchObservedRunningTime="2026-03-10 16:30:01.422370306 +0000 UTC m=+5066.129185094" Mar 10 16:30:02 crc kubenswrapper[4743]: I0310 16:30:02.403347 4743 generic.go:334] "Generic (PLEG): container finished" podID="03b63f95-0d97-49ac-91d3-636a50fcfe43" containerID="818ad35accecbc532dd62d54a091e15566b21ccccb2d547586d17047f48acbee" exitCode=0 Mar 10 16:30:02 crc kubenswrapper[4743]: I0310 16:30:02.403451 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" 
event={"ID":"03b63f95-0d97-49ac-91d3-636a50fcfe43","Type":"ContainerDied","Data":"818ad35accecbc532dd62d54a091e15566b21ccccb2d547586d17047f48acbee"} Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.414624 4743 generic.go:334] "Generic (PLEG): container finished" podID="d9233029-7319-4fed-b056-0bbeb8831ac5" containerID="19d722dcdf37551f6498072e6554fa1cbf678e3c92ba17ece31be8d43bfed3c2" exitCode=0 Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.414731 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552670-wpqm9" event={"ID":"d9233029-7319-4fed-b056-0bbeb8831ac5","Type":"ContainerDied","Data":"19d722dcdf37551f6498072e6554fa1cbf678e3c92ba17ece31be8d43bfed3c2"} Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.784944 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.933599 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lhgt\" (UniqueName: \"kubernetes.io/projected/03b63f95-0d97-49ac-91d3-636a50fcfe43-kube-api-access-4lhgt\") pod \"03b63f95-0d97-49ac-91d3-636a50fcfe43\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.933975 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03b63f95-0d97-49ac-91d3-636a50fcfe43-secret-volume\") pod \"03b63f95-0d97-49ac-91d3-636a50fcfe43\" (UID: \"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.934184 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03b63f95-0d97-49ac-91d3-636a50fcfe43-config-volume\") pod \"03b63f95-0d97-49ac-91d3-636a50fcfe43\" (UID: 
\"03b63f95-0d97-49ac-91d3-636a50fcfe43\") " Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.934966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b63f95-0d97-49ac-91d3-636a50fcfe43-config-volume" (OuterVolumeSpecName: "config-volume") pod "03b63f95-0d97-49ac-91d3-636a50fcfe43" (UID: "03b63f95-0d97-49ac-91d3-636a50fcfe43"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.940839 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b63f95-0d97-49ac-91d3-636a50fcfe43-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "03b63f95-0d97-49ac-91d3-636a50fcfe43" (UID: "03b63f95-0d97-49ac-91d3-636a50fcfe43"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:30:03 crc kubenswrapper[4743]: I0310 16:30:03.975362 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b63f95-0d97-49ac-91d3-636a50fcfe43-kube-api-access-4lhgt" (OuterVolumeSpecName: "kube-api-access-4lhgt") pod "03b63f95-0d97-49ac-91d3-636a50fcfe43" (UID: "03b63f95-0d97-49ac-91d3-636a50fcfe43"). InnerVolumeSpecName "kube-api-access-4lhgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.036917 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03b63f95-0d97-49ac-91d3-636a50fcfe43-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.036951 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lhgt\" (UniqueName: \"kubernetes.io/projected/03b63f95-0d97-49ac-91d3-636a50fcfe43-kube-api-access-4lhgt\") on node \"crc\" DevicePath \"\"" Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.036960 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03b63f95-0d97-49ac-91d3-636a50fcfe43-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.424938 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.427895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-z67wp" event={"ID":"03b63f95-0d97-49ac-91d3-636a50fcfe43","Type":"ContainerDied","Data":"bc55f2edb8a99191cac604d7fe04225c72a4fd63a5d35d7a4869f89f01044f97"} Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.427943 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc55f2edb8a99191cac604d7fe04225c72a4fd63a5d35d7a4869f89f01044f97" Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.494632 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq"] Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.506202 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-qsbsq"] Mar 10 16:30:04 crc kubenswrapper[4743]: I0310 16:30:04.983952 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552670-wpqm9" Mar 10 16:30:05 crc kubenswrapper[4743]: I0310 16:30:05.159195 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr2dm\" (UniqueName: \"kubernetes.io/projected/d9233029-7319-4fed-b056-0bbeb8831ac5-kube-api-access-xr2dm\") pod \"d9233029-7319-4fed-b056-0bbeb8831ac5\" (UID: \"d9233029-7319-4fed-b056-0bbeb8831ac5\") " Mar 10 16:30:05 crc kubenswrapper[4743]: I0310 16:30:05.166850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9233029-7319-4fed-b056-0bbeb8831ac5-kube-api-access-xr2dm" (OuterVolumeSpecName: "kube-api-access-xr2dm") pod "d9233029-7319-4fed-b056-0bbeb8831ac5" (UID: "d9233029-7319-4fed-b056-0bbeb8831ac5"). InnerVolumeSpecName "kube-api-access-xr2dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:30:05 crc kubenswrapper[4743]: I0310 16:30:05.262347 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr2dm\" (UniqueName: \"kubernetes.io/projected/d9233029-7319-4fed-b056-0bbeb8831ac5-kube-api-access-xr2dm\") on node \"crc\" DevicePath \"\"" Mar 10 16:30:05 crc kubenswrapper[4743]: I0310 16:30:05.445225 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552670-wpqm9" event={"ID":"d9233029-7319-4fed-b056-0bbeb8831ac5","Type":"ContainerDied","Data":"0a3ac8d48b5d3ad4ad70ab64d6bb3cb52170df070d2c95f767f1af394d1faa85"} Mar 10 16:30:05 crc kubenswrapper[4743]: I0310 16:30:05.445269 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552670-wpqm9" Mar 10 16:30:05 crc kubenswrapper[4743]: I0310 16:30:05.445288 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a3ac8d48b5d3ad4ad70ab64d6bb3cb52170df070d2c95f767f1af394d1faa85" Mar 10 16:30:05 crc kubenswrapper[4743]: I0310 16:30:05.929170 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="054edb77-7d07-4c2b-adf6-c50909d6dc2b" path="/var/lib/kubelet/pods/054edb77-7d07-4c2b-adf6-c50909d6dc2b/volumes" Mar 10 16:30:06 crc kubenswrapper[4743]: I0310 16:30:06.056392 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552664-wzplb"] Mar 10 16:30:06 crc kubenswrapper[4743]: I0310 16:30:06.067525 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552664-wzplb"] Mar 10 16:30:07 crc kubenswrapper[4743]: I0310 16:30:07.925550 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19da3dc-cca4-4b3f-9970-0f6b44098031" path="/var/lib/kubelet/pods/a19da3dc-cca4-4b3f-9970-0f6b44098031/volumes" Mar 10 16:30:08 crc kubenswrapper[4743]: I0310 16:30:08.370962 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk_e3871ddd-8c74-4c7f-a368-75bf19bdd67a/util/0.log" Mar 10 16:30:08 crc kubenswrapper[4743]: I0310 16:30:08.589168 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk_e3871ddd-8c74-4c7f-a368-75bf19bdd67a/util/0.log" Mar 10 16:30:08 crc kubenswrapper[4743]: I0310 16:30:08.609476 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk_e3871ddd-8c74-4c7f-a368-75bf19bdd67a/pull/0.log" Mar 10 16:30:08 crc kubenswrapper[4743]: I0310 16:30:08.646841 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk_e3871ddd-8c74-4c7f-a368-75bf19bdd67a/pull/0.log" Mar 10 16:30:08 crc kubenswrapper[4743]: I0310 16:30:08.822898 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk_e3871ddd-8c74-4c7f-a368-75bf19bdd67a/util/0.log" Mar 10 16:30:08 crc kubenswrapper[4743]: I0310 16:30:08.897575 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk_e3871ddd-8c74-4c7f-a368-75bf19bdd67a/pull/0.log" Mar 10 16:30:08 crc kubenswrapper[4743]: I0310 16:30:08.897680 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828c2dk_e3871ddd-8c74-4c7f-a368-75bf19bdd67a/extract/0.log" Mar 10 16:30:09 crc kubenswrapper[4743]: I0310 16:30:09.013010 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2jrfb_e008807d-9026-49b7-9a83-c375cd1f23cc/extract-utilities/0.log" Mar 10 16:30:09 crc kubenswrapper[4743]: I0310 16:30:09.237787 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2jrfb_e008807d-9026-49b7-9a83-c375cd1f23cc/extract-utilities/0.log" Mar 10 16:30:09 crc kubenswrapper[4743]: I0310 16:30:09.246447 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2jrfb_e008807d-9026-49b7-9a83-c375cd1f23cc/extract-content/0.log" Mar 10 16:30:09 crc kubenswrapper[4743]: I0310 16:30:09.258104 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2jrfb_e008807d-9026-49b7-9a83-c375cd1f23cc/extract-content/0.log" Mar 10 16:30:09 crc kubenswrapper[4743]: I0310 16:30:09.505040 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2jrfb_e008807d-9026-49b7-9a83-c375cd1f23cc/extract-utilities/0.log" Mar 10 16:30:09 crc kubenswrapper[4743]: I0310 16:30:09.519775 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2jrfb_e008807d-9026-49b7-9a83-c375cd1f23cc/extract-content/0.log" Mar 10 16:30:09 crc kubenswrapper[4743]: I0310 16:30:09.719584 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lx45d_f6ffea72-0677-4f63-b0ba-5501881256da/extract-utilities/0.log" Mar 10 16:30:09 crc kubenswrapper[4743]: I0310 16:30:09.936228 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lx45d_f6ffea72-0677-4f63-b0ba-5501881256da/extract-content/0.log" Mar 10 16:30:10 crc kubenswrapper[4743]: I0310 16:30:10.009396 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lx45d_f6ffea72-0677-4f63-b0ba-5501881256da/extract-utilities/0.log" Mar 10 16:30:10 crc kubenswrapper[4743]: I0310 16:30:10.041952 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lx45d_f6ffea72-0677-4f63-b0ba-5501881256da/extract-content/0.log" Mar 10 16:30:10 crc kubenswrapper[4743]: I0310 16:30:10.213408 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2jrfb_e008807d-9026-49b7-9a83-c375cd1f23cc/registry-server/0.log" Mar 10 16:30:10 crc kubenswrapper[4743]: I0310 16:30:10.484003 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lx45d_f6ffea72-0677-4f63-b0ba-5501881256da/extract-content/0.log" Mar 10 16:30:10 crc kubenswrapper[4743]: I0310 16:30:10.523008 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lx45d_f6ffea72-0677-4f63-b0ba-5501881256da/extract-utilities/0.log" Mar 10 16:30:10 crc kubenswrapper[4743]: I0310 16:30:10.784793 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5_04d8c366-53ae-4237-a441-9acd9c158909/util/0.log" Mar 10 16:30:10 crc kubenswrapper[4743]: I0310 16:30:10.937070 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5_04d8c366-53ae-4237-a441-9acd9c158909/util/0.log" Mar 10 16:30:10 crc kubenswrapper[4743]: I0310 16:30:10.979540 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5_04d8c366-53ae-4237-a441-9acd9c158909/pull/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.057578 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5_04d8c366-53ae-4237-a441-9acd9c158909/pull/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.252024 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.252081 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.252128 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.252941 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.253010 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" gracePeriod=600 Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.256291 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lx45d_f6ffea72-0677-4f63-b0ba-5501881256da/registry-server/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.262415 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5_04d8c366-53ae-4237-a441-9acd9c158909/extract/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.309320 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5_04d8c366-53ae-4237-a441-9acd9c158909/pull/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.309662 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ckwd5_04d8c366-53ae-4237-a441-9acd9c158909/util/0.log" Mar 10 
16:30:11 crc kubenswrapper[4743]: E0310 16:30:11.381645 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.497725 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2tpls_f6c8824b-120a-4480-bdad-a18584d52bad/marketplace-operator/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.501285 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" exitCode=0 Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.501333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739"} Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.501380 4743 scope.go:117] "RemoveContainer" containerID="957f98a08ddd6714ecff509167edd29c6f257ddbc30383c819edc430ac24893f" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.502413 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:30:11 crc kubenswrapper[4743]: E0310 16:30:11.502879 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.580794 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9wrsh_26d17452-34b9-4c1e-a95c-3840493fe263/extract-utilities/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.785709 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9wrsh_26d17452-34b9-4c1e-a95c-3840493fe263/extract-content/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.785898 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9wrsh_26d17452-34b9-4c1e-a95c-3840493fe263/extract-utilities/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.800745 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9wrsh_26d17452-34b9-4c1e-a95c-3840493fe263/extract-content/0.log" Mar 10 16:30:11 crc kubenswrapper[4743]: I0310 16:30:11.984170 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9wrsh_26d17452-34b9-4c1e-a95c-3840493fe263/extract-utilities/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.004428 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9wrsh_26d17452-34b9-4c1e-a95c-3840493fe263/extract-content/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.186209 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9wrsh_26d17452-34b9-4c1e-a95c-3840493fe263/registry-server/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.315937 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zvq8n_3ebb183b-5e0b-4c6f-ad0d-96bd60832852/extract-utilities/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.444136 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zvq8n_3ebb183b-5e0b-4c6f-ad0d-96bd60832852/extract-utilities/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.460489 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zvq8n_3ebb183b-5e0b-4c6f-ad0d-96bd60832852/extract-content/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.481600 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zvq8n_3ebb183b-5e0b-4c6f-ad0d-96bd60832852/extract-content/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.639033 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zvq8n_3ebb183b-5e0b-4c6f-ad0d-96bd60832852/extract-utilities/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.658153 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zvq8n_3ebb183b-5e0b-4c6f-ad0d-96bd60832852/extract-content/0.log" Mar 10 16:30:12 crc kubenswrapper[4743]: I0310 16:30:12.791312 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zvq8n_3ebb183b-5e0b-4c6f-ad0d-96bd60832852/registry-server/0.log" Mar 10 16:30:18 crc kubenswrapper[4743]: I0310 16:30:18.169844 4743 scope.go:117] "RemoveContainer" containerID="d76631c845312f3b1b7694f669589e294c5e941eda404c319e3c3fe45e8a40a4" Mar 10 16:30:18 crc kubenswrapper[4743]: I0310 16:30:18.194464 4743 scope.go:117] "RemoveContainer" containerID="1fc7feddb092cc3a7d5152f0c965a8c29ce3e3eb3079ea4c66a76fd008c47a8d" Mar 10 16:30:23 crc kubenswrapper[4743]: I0310 16:30:23.915151 4743 scope.go:117] "RemoveContainer" 
containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:30:23 crc kubenswrapper[4743]: E0310 16:30:23.915846 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:30:37 crc kubenswrapper[4743]: I0310 16:30:37.915899 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:30:37 crc kubenswrapper[4743]: E0310 16:30:37.916687 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:30:45 crc kubenswrapper[4743]: E0310 16:30:45.474497 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.115:52884->38.102.83.115:40715: write tcp 38.102.83.115:52884->38.102.83.115:40715: write: broken pipe Mar 10 16:30:50 crc kubenswrapper[4743]: I0310 16:30:50.915004 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:30:50 crc kubenswrapper[4743]: E0310 16:30:50.916582 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:31:05 crc kubenswrapper[4743]: I0310 16:31:05.948431 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:31:05 crc kubenswrapper[4743]: E0310 16:31:05.950719 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:31:16 crc kubenswrapper[4743]: I0310 16:31:16.916657 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:31:16 crc kubenswrapper[4743]: E0310 16:31:16.917371 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:31:27 crc kubenswrapper[4743]: I0310 16:31:27.915866 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:31:27 crc kubenswrapper[4743]: E0310 16:31:27.916845 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:31:39 crc kubenswrapper[4743]: I0310 16:31:39.917244 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:31:39 crc kubenswrapper[4743]: E0310 16:31:39.919211 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:31:51 crc kubenswrapper[4743]: I0310 16:31:51.915567 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:31:51 crc kubenswrapper[4743]: E0310 16:31:51.916322 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.149504 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552672-zhxpv"] Mar 10 16:32:00 crc kubenswrapper[4743]: E0310 16:32:00.150624 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b63f95-0d97-49ac-91d3-636a50fcfe43" containerName="collect-profiles" Mar 10 16:32:00 crc 
kubenswrapper[4743]: I0310 16:32:00.150646 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b63f95-0d97-49ac-91d3-636a50fcfe43" containerName="collect-profiles" Mar 10 16:32:00 crc kubenswrapper[4743]: E0310 16:32:00.150665 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9233029-7319-4fed-b056-0bbeb8831ac5" containerName="oc" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.150674 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9233029-7319-4fed-b056-0bbeb8831ac5" containerName="oc" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.151911 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9233029-7319-4fed-b056-0bbeb8831ac5" containerName="oc" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.151962 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b63f95-0d97-49ac-91d3-636a50fcfe43" containerName="collect-profiles" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.155891 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552672-zhxpv" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.159650 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.159978 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.160120 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.189252 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552672-zhxpv"] Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.247274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6hvb\" (UniqueName: \"kubernetes.io/projected/c534b8a9-c60b-4307-967d-e1fe25a4a451-kube-api-access-v6hvb\") pod \"auto-csr-approver-29552672-zhxpv\" (UID: \"c534b8a9-c60b-4307-967d-e1fe25a4a451\") " pod="openshift-infra/auto-csr-approver-29552672-zhxpv" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.349316 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6hvb\" (UniqueName: \"kubernetes.io/projected/c534b8a9-c60b-4307-967d-e1fe25a4a451-kube-api-access-v6hvb\") pod \"auto-csr-approver-29552672-zhxpv\" (UID: \"c534b8a9-c60b-4307-967d-e1fe25a4a451\") " pod="openshift-infra/auto-csr-approver-29552672-zhxpv" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.662238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6hvb\" (UniqueName: \"kubernetes.io/projected/c534b8a9-c60b-4307-967d-e1fe25a4a451-kube-api-access-v6hvb\") pod \"auto-csr-approver-29552672-zhxpv\" (UID: \"c534b8a9-c60b-4307-967d-e1fe25a4a451\") " 
pod="openshift-infra/auto-csr-approver-29552672-zhxpv" Mar 10 16:32:00 crc kubenswrapper[4743]: I0310 16:32:00.785694 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552672-zhxpv" Mar 10 16:32:01 crc kubenswrapper[4743]: I0310 16:32:01.275573 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552672-zhxpv"] Mar 10 16:32:01 crc kubenswrapper[4743]: W0310 16:32:01.285052 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc534b8a9_c60b_4307_967d_e1fe25a4a451.slice/crio-a5f7f7aa140bd0849771c127358b746cdd01bc7e89a3db2c2450a1f7fea71641 WatchSource:0}: Error finding container a5f7f7aa140bd0849771c127358b746cdd01bc7e89a3db2c2450a1f7fea71641: Status 404 returned error can't find the container with id a5f7f7aa140bd0849771c127358b746cdd01bc7e89a3db2c2450a1f7fea71641 Mar 10 16:32:01 crc kubenswrapper[4743]: I0310 16:32:01.582364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552672-zhxpv" event={"ID":"c534b8a9-c60b-4307-967d-e1fe25a4a451","Type":"ContainerStarted","Data":"a5f7f7aa140bd0849771c127358b746cdd01bc7e89a3db2c2450a1f7fea71641"} Mar 10 16:32:02 crc kubenswrapper[4743]: I0310 16:32:02.915988 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:32:02 crc kubenswrapper[4743]: E0310 16:32:02.916643 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:32:04 crc 
kubenswrapper[4743]: I0310 16:32:04.616313 4743 generic.go:334] "Generic (PLEG): container finished" podID="c534b8a9-c60b-4307-967d-e1fe25a4a451" containerID="50a59119eee51b1af8077904069a78ac64ebdf1b610a58cfab7e9807bc73adba" exitCode=0 Mar 10 16:32:04 crc kubenswrapper[4743]: I0310 16:32:04.616409 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552672-zhxpv" event={"ID":"c534b8a9-c60b-4307-967d-e1fe25a4a451","Type":"ContainerDied","Data":"50a59119eee51b1af8077904069a78ac64ebdf1b610a58cfab7e9807bc73adba"} Mar 10 16:32:05 crc kubenswrapper[4743]: I0310 16:32:05.999577 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552672-zhxpv" Mar 10 16:32:06 crc kubenswrapper[4743]: I0310 16:32:06.084349 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6hvb\" (UniqueName: \"kubernetes.io/projected/c534b8a9-c60b-4307-967d-e1fe25a4a451-kube-api-access-v6hvb\") pod \"c534b8a9-c60b-4307-967d-e1fe25a4a451\" (UID: \"c534b8a9-c60b-4307-967d-e1fe25a4a451\") " Mar 10 16:32:06 crc kubenswrapper[4743]: I0310 16:32:06.098099 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c534b8a9-c60b-4307-967d-e1fe25a4a451-kube-api-access-v6hvb" (OuterVolumeSpecName: "kube-api-access-v6hvb") pod "c534b8a9-c60b-4307-967d-e1fe25a4a451" (UID: "c534b8a9-c60b-4307-967d-e1fe25a4a451"). InnerVolumeSpecName "kube-api-access-v6hvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:32:06 crc kubenswrapper[4743]: I0310 16:32:06.188983 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6hvb\" (UniqueName: \"kubernetes.io/projected/c534b8a9-c60b-4307-967d-e1fe25a4a451-kube-api-access-v6hvb\") on node \"crc\" DevicePath \"\"" Mar 10 16:32:06 crc kubenswrapper[4743]: I0310 16:32:06.639782 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552672-zhxpv" event={"ID":"c534b8a9-c60b-4307-967d-e1fe25a4a451","Type":"ContainerDied","Data":"a5f7f7aa140bd0849771c127358b746cdd01bc7e89a3db2c2450a1f7fea71641"} Mar 10 16:32:06 crc kubenswrapper[4743]: I0310 16:32:06.640175 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5f7f7aa140bd0849771c127358b746cdd01bc7e89a3db2c2450a1f7fea71641" Mar 10 16:32:06 crc kubenswrapper[4743]: I0310 16:32:06.640012 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552672-zhxpv" Mar 10 16:32:07 crc kubenswrapper[4743]: I0310 16:32:07.079709 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552666-dnlgn"] Mar 10 16:32:07 crc kubenswrapper[4743]: I0310 16:32:07.092219 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552666-dnlgn"] Mar 10 16:32:07 crc kubenswrapper[4743]: I0310 16:32:07.938854 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4dd393b-7153-474c-a81c-c07d0cb9d1db" path="/var/lib/kubelet/pods/e4dd393b-7153-474c-a81c-c07d0cb9d1db/volumes" Mar 10 16:32:14 crc kubenswrapper[4743]: I0310 16:32:14.917753 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:32:14 crc kubenswrapper[4743]: E0310 16:32:14.918915 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:32:18 crc kubenswrapper[4743]: I0310 16:32:18.360626 4743 scope.go:117] "RemoveContainer" containerID="d1527afc00637b519ad9be826bf34320cfd01a53af49438848c06d81c19bb260" Mar 10 16:32:28 crc kubenswrapper[4743]: I0310 16:32:28.916955 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:32:28 crc kubenswrapper[4743]: E0310 16:32:28.917722 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:32:33 crc kubenswrapper[4743]: I0310 16:32:33.953040 4743 generic.go:334] "Generic (PLEG): container finished" podID="16ef51b2-326c-403e-996c-2791378770a3" containerID="308e38ba2a0c8143709f8ccc5005d67787ed76e1a65bb0b9d6bffdb8c9288aea" exitCode=0 Mar 10 16:32:33 crc kubenswrapper[4743]: I0310 16:32:33.953104 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-87xtf/must-gather-75tqt" event={"ID":"16ef51b2-326c-403e-996c-2791378770a3","Type":"ContainerDied","Data":"308e38ba2a0c8143709f8ccc5005d67787ed76e1a65bb0b9d6bffdb8c9288aea"} Mar 10 16:32:33 crc kubenswrapper[4743]: I0310 16:32:33.955537 4743 scope.go:117] "RemoveContainer" containerID="308e38ba2a0c8143709f8ccc5005d67787ed76e1a65bb0b9d6bffdb8c9288aea" Mar 10 16:32:34 crc 
kubenswrapper[4743]: I0310 16:32:34.552286 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-87xtf_must-gather-75tqt_16ef51b2-326c-403e-996c-2791378770a3/gather/0.log" Mar 10 16:32:40 crc kubenswrapper[4743]: I0310 16:32:40.915543 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:32:40 crc kubenswrapper[4743]: E0310 16:32:40.916219 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:32:42 crc kubenswrapper[4743]: I0310 16:32:42.949590 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-87xtf/must-gather-75tqt"] Mar 10 16:32:42 crc kubenswrapper[4743]: I0310 16:32:42.950460 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-87xtf/must-gather-75tqt" podUID="16ef51b2-326c-403e-996c-2791378770a3" containerName="copy" containerID="cri-o://94741437692a7767a61752805f74806cec09d9eb745a9e200fb4843773457a15" gracePeriod=2 Mar 10 16:32:42 crc kubenswrapper[4743]: I0310 16:32:42.964434 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-87xtf/must-gather-75tqt"] Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.087115 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-87xtf_must-gather-75tqt_16ef51b2-326c-403e-996c-2791378770a3/copy/0.log" Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.090261 4743 generic.go:334] "Generic (PLEG): container finished" podID="16ef51b2-326c-403e-996c-2791378770a3" 
containerID="94741437692a7767a61752805f74806cec09d9eb745a9e200fb4843773457a15" exitCode=143 Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.441799 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-87xtf_must-gather-75tqt_16ef51b2-326c-403e-996c-2791378770a3/copy/0.log" Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.442482 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.584125 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16ef51b2-326c-403e-996c-2791378770a3-must-gather-output\") pod \"16ef51b2-326c-403e-996c-2791378770a3\" (UID: \"16ef51b2-326c-403e-996c-2791378770a3\") " Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.584245 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzhq6\" (UniqueName: \"kubernetes.io/projected/16ef51b2-326c-403e-996c-2791378770a3-kube-api-access-mzhq6\") pod \"16ef51b2-326c-403e-996c-2791378770a3\" (UID: \"16ef51b2-326c-403e-996c-2791378770a3\") " Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.593183 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ef51b2-326c-403e-996c-2791378770a3-kube-api-access-mzhq6" (OuterVolumeSpecName: "kube-api-access-mzhq6") pod "16ef51b2-326c-403e-996c-2791378770a3" (UID: "16ef51b2-326c-403e-996c-2791378770a3"). InnerVolumeSpecName "kube-api-access-mzhq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.686720 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzhq6\" (UniqueName: \"kubernetes.io/projected/16ef51b2-326c-403e-996c-2791378770a3-kube-api-access-mzhq6\") on node \"crc\" DevicePath \"\"" Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.849982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ef51b2-326c-403e-996c-2791378770a3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "16ef51b2-326c-403e-996c-2791378770a3" (UID: "16ef51b2-326c-403e-996c-2791378770a3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.897154 4743 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16ef51b2-326c-403e-996c-2791378770a3-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 16:32:43 crc kubenswrapper[4743]: I0310 16:32:43.927628 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ef51b2-326c-403e-996c-2791378770a3" path="/var/lib/kubelet/pods/16ef51b2-326c-403e-996c-2791378770a3/volumes" Mar 10 16:32:44 crc kubenswrapper[4743]: I0310 16:32:44.100632 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-87xtf_must-gather-75tqt_16ef51b2-326c-403e-996c-2791378770a3/copy/0.log" Mar 10 16:32:44 crc kubenswrapper[4743]: I0310 16:32:44.101172 4743 scope.go:117] "RemoveContainer" containerID="94741437692a7767a61752805f74806cec09d9eb745a9e200fb4843773457a15" Mar 10 16:32:44 crc kubenswrapper[4743]: I0310 16:32:44.101238 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-87xtf/must-gather-75tqt" Mar 10 16:32:44 crc kubenswrapper[4743]: I0310 16:32:44.131987 4743 scope.go:117] "RemoveContainer" containerID="308e38ba2a0c8143709f8ccc5005d67787ed76e1a65bb0b9d6bffdb8c9288aea" Mar 10 16:32:55 crc kubenswrapper[4743]: I0310 16:32:55.923683 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:32:55 crc kubenswrapper[4743]: E0310 16:32:55.924599 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:33:06 crc kubenswrapper[4743]: I0310 16:33:06.914990 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:33:06 crc kubenswrapper[4743]: E0310 16:33:06.915804 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:33:18 crc kubenswrapper[4743]: I0310 16:33:18.453217 4743 scope.go:117] "RemoveContainer" containerID="479fcfcd456f84603df779cd452017c57731e8a2b56775d9817a0ce5c0cc927a" Mar 10 16:33:18 crc kubenswrapper[4743]: I0310 16:33:18.479715 4743 scope.go:117] "RemoveContainer" containerID="5e6181dfae88f446ffb5ec81617ca2aaed85ac07b50486ea013c0d21c5c3d8a4" Mar 10 16:33:21 crc kubenswrapper[4743]: 
I0310 16:33:21.915397 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:33:21 crc kubenswrapper[4743]: E0310 16:33:21.916374 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:33:32 crc kubenswrapper[4743]: I0310 16:33:32.916445 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:33:32 crc kubenswrapper[4743]: E0310 16:33:32.918602 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.310281 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s96r5"] Mar 10 16:33:36 crc kubenswrapper[4743]: E0310 16:33:36.311226 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ef51b2-326c-403e-996c-2791378770a3" containerName="copy" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.311242 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ef51b2-326c-403e-996c-2791378770a3" containerName="copy" Mar 10 16:33:36 crc kubenswrapper[4743]: E0310 16:33:36.311253 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="16ef51b2-326c-403e-996c-2791378770a3" containerName="gather" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.311259 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ef51b2-326c-403e-996c-2791378770a3" containerName="gather" Mar 10 16:33:36 crc kubenswrapper[4743]: E0310 16:33:36.311270 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c534b8a9-c60b-4307-967d-e1fe25a4a451" containerName="oc" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.311279 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c534b8a9-c60b-4307-967d-e1fe25a4a451" containerName="oc" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.311486 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c534b8a9-c60b-4307-967d-e1fe25a4a451" containerName="oc" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.311502 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ef51b2-326c-403e-996c-2791378770a3" containerName="copy" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.311510 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ef51b2-326c-403e-996c-2791378770a3" containerName="gather" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.312999 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.342138 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s96r5"] Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.432244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-utilities\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.432483 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j22mq\" (UniqueName: \"kubernetes.io/projected/8f8d69ee-a4cc-488b-acef-e942adbb42ab-kube-api-access-j22mq\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.432712 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-catalog-content\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.534665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j22mq\" (UniqueName: \"kubernetes.io/projected/8f8d69ee-a4cc-488b-acef-e942adbb42ab-kube-api-access-j22mq\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.535091 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-catalog-content\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.535231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-utilities\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.535753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-utilities\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.536071 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-catalog-content\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.557473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j22mq\" (UniqueName: \"kubernetes.io/projected/8f8d69ee-a4cc-488b-acef-e942adbb42ab-kube-api-access-j22mq\") pod \"certified-operators-s96r5\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:36 crc kubenswrapper[4743]: I0310 16:33:36.637491 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:37 crc kubenswrapper[4743]: I0310 16:33:37.105636 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s96r5"] Mar 10 16:33:37 crc kubenswrapper[4743]: I0310 16:33:37.634791 4743 generic.go:334] "Generic (PLEG): container finished" podID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerID="758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa" exitCode=0 Mar 10 16:33:37 crc kubenswrapper[4743]: I0310 16:33:37.634878 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96r5" event={"ID":"8f8d69ee-a4cc-488b-acef-e942adbb42ab","Type":"ContainerDied","Data":"758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa"} Mar 10 16:33:37 crc kubenswrapper[4743]: I0310 16:33:37.635118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96r5" event={"ID":"8f8d69ee-a4cc-488b-acef-e942adbb42ab","Type":"ContainerStarted","Data":"76ad206d235a0e8f21e6e4ac046e9230d4cca13c0c27ebbb31ca184f468b4675"} Mar 10 16:33:37 crc kubenswrapper[4743]: I0310 16:33:37.639053 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:33:39 crc kubenswrapper[4743]: I0310 16:33:39.660484 4743 generic.go:334] "Generic (PLEG): container finished" podID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerID="5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b" exitCode=0 Mar 10 16:33:39 crc kubenswrapper[4743]: I0310 16:33:39.660572 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96r5" event={"ID":"8f8d69ee-a4cc-488b-acef-e942adbb42ab","Type":"ContainerDied","Data":"5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b"} Mar 10 16:33:40 crc kubenswrapper[4743]: I0310 16:33:40.672801 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-s96r5" event={"ID":"8f8d69ee-a4cc-488b-acef-e942adbb42ab","Type":"ContainerStarted","Data":"c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c"} Mar 10 16:33:40 crc kubenswrapper[4743]: I0310 16:33:40.696105 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s96r5" podStartSLOduration=2.070854386 podStartE2EDuration="4.696085727s" podCreationTimestamp="2026-03-10 16:33:36 +0000 UTC" firstStartedPulling="2026-03-10 16:33:37.638532374 +0000 UTC m=+5282.345347132" lastFinishedPulling="2026-03-10 16:33:40.263763715 +0000 UTC m=+5284.970578473" observedRunningTime="2026-03-10 16:33:40.691213308 +0000 UTC m=+5285.398028056" watchObservedRunningTime="2026-03-10 16:33:40.696085727 +0000 UTC m=+5285.402900485" Mar 10 16:33:45 crc kubenswrapper[4743]: I0310 16:33:45.923368 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:33:45 crc kubenswrapper[4743]: E0310 16:33:45.924268 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:33:46 crc kubenswrapper[4743]: I0310 16:33:46.637569 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:46 crc kubenswrapper[4743]: I0310 16:33:46.638243 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:46 crc kubenswrapper[4743]: I0310 16:33:46.707269 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:46 crc kubenswrapper[4743]: I0310 16:33:46.773440 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:47 crc kubenswrapper[4743]: I0310 16:33:46.942655 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s96r5"] Mar 10 16:33:48 crc kubenswrapper[4743]: I0310 16:33:48.737764 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s96r5" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerName="registry-server" containerID="cri-o://c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c" gracePeriod=2 Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.209683 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.305791 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-utilities\") pod \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.305878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-catalog-content\") pod \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.305923 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j22mq\" (UniqueName: 
\"kubernetes.io/projected/8f8d69ee-a4cc-488b-acef-e942adbb42ab-kube-api-access-j22mq\") pod \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\" (UID: \"8f8d69ee-a4cc-488b-acef-e942adbb42ab\") " Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.308671 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-utilities" (OuterVolumeSpecName: "utilities") pod "8f8d69ee-a4cc-488b-acef-e942adbb42ab" (UID: "8f8d69ee-a4cc-488b-acef-e942adbb42ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.314030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8d69ee-a4cc-488b-acef-e942adbb42ab-kube-api-access-j22mq" (OuterVolumeSpecName: "kube-api-access-j22mq") pod "8f8d69ee-a4cc-488b-acef-e942adbb42ab" (UID: "8f8d69ee-a4cc-488b-acef-e942adbb42ab"). InnerVolumeSpecName "kube-api-access-j22mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.408143 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.408175 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j22mq\" (UniqueName: \"kubernetes.io/projected/8f8d69ee-a4cc-488b-acef-e942adbb42ab-kube-api-access-j22mq\") on node \"crc\" DevicePath \"\"" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.750519 4743 generic.go:334] "Generic (PLEG): container finished" podID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerID="c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c" exitCode=0 Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.750624 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s96r5" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.751956 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96r5" event={"ID":"8f8d69ee-a4cc-488b-acef-e942adbb42ab","Type":"ContainerDied","Data":"c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c"} Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.752084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96r5" event={"ID":"8f8d69ee-a4cc-488b-acef-e942adbb42ab","Type":"ContainerDied","Data":"76ad206d235a0e8f21e6e4ac046e9230d4cca13c0c27ebbb31ca184f468b4675"} Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.752203 4743 scope.go:117] "RemoveContainer" containerID="c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.779964 4743 scope.go:117] "RemoveContainer" containerID="5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.836860 4743 scope.go:117] "RemoveContainer" containerID="758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.860107 4743 scope.go:117] "RemoveContainer" containerID="c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c" Mar 10 16:33:49 crc kubenswrapper[4743]: E0310 16:33:49.860729 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c\": container with ID starting with c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c not found: ID does not exist" containerID="c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.860787 4743 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c"} err="failed to get container status \"c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c\": rpc error: code = NotFound desc = could not find container \"c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c\": container with ID starting with c85dee00ca7ba8a50d6dd8084bdb892e241b9926d8a31ba17230d32127b8d62c not found: ID does not exist" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.860827 4743 scope.go:117] "RemoveContainer" containerID="5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b" Mar 10 16:33:49 crc kubenswrapper[4743]: E0310 16:33:49.861379 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b\": container with ID starting with 5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b not found: ID does not exist" containerID="5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.861457 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b"} err="failed to get container status \"5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b\": rpc error: code = NotFound desc = could not find container \"5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b\": container with ID starting with 5e851c1dfcc8af35ed7efb28f04436e16baf81e075cad77d490093de86995e8b not found: ID does not exist" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.861500 4743 scope.go:117] "RemoveContainer" containerID="758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa" Mar 10 16:33:49 crc kubenswrapper[4743]: E0310 16:33:49.861831 4743 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa\": container with ID starting with 758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa not found: ID does not exist" containerID="758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.861857 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa"} err="failed to get container status \"758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa\": rpc error: code = NotFound desc = could not find container \"758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa\": container with ID starting with 758fbd9db6ced92fdb721a16943c10dccbb79ba18bdf7a105bdd2aea8e9a2cfa not found: ID does not exist" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.917725 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f8d69ee-a4cc-488b-acef-e942adbb42ab" (UID: "8f8d69ee-a4cc-488b-acef-e942adbb42ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:33:49 crc kubenswrapper[4743]: I0310 16:33:49.919593 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f8d69ee-a4cc-488b-acef-e942adbb42ab-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:33:50 crc kubenswrapper[4743]: I0310 16:33:50.073987 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s96r5"] Mar 10 16:33:50 crc kubenswrapper[4743]: I0310 16:33:50.083918 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s96r5"] Mar 10 16:33:51 crc kubenswrapper[4743]: I0310 16:33:51.926507 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" path="/var/lib/kubelet/pods/8f8d69ee-a4cc-488b-acef-e942adbb42ab/volumes" Mar 10 16:33:57 crc kubenswrapper[4743]: I0310 16:33:57.915964 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:33:57 crc kubenswrapper[4743]: E0310 16:33:57.916914 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.157217 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552674-bqsgd"] Mar 10 16:34:00 crc kubenswrapper[4743]: E0310 16:34:00.158235 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerName="extract-utilities" Mar 10 16:34:00 crc 
kubenswrapper[4743]: I0310 16:34:00.158254 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerName="extract-utilities" Mar 10 16:34:00 crc kubenswrapper[4743]: E0310 16:34:00.158279 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerName="extract-content" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.158287 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerName="extract-content" Mar 10 16:34:00 crc kubenswrapper[4743]: E0310 16:34:00.158321 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerName="registry-server" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.158330 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerName="registry-server" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.158584 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8d69ee-a4cc-488b-acef-e942adbb42ab" containerName="registry-server" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.159357 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.162501 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.162611 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.163559 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.172586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552674-bqsgd"] Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.324967 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9nlh\" (UniqueName: \"kubernetes.io/projected/bb665b94-88c1-4d19-955e-421f44637d5b-kube-api-access-c9nlh\") pod \"auto-csr-approver-29552674-bqsgd\" (UID: \"bb665b94-88c1-4d19-955e-421f44637d5b\") " pod="openshift-infra/auto-csr-approver-29552674-bqsgd" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.426729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9nlh\" (UniqueName: \"kubernetes.io/projected/bb665b94-88c1-4d19-955e-421f44637d5b-kube-api-access-c9nlh\") pod \"auto-csr-approver-29552674-bqsgd\" (UID: \"bb665b94-88c1-4d19-955e-421f44637d5b\") " pod="openshift-infra/auto-csr-approver-29552674-bqsgd" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.452331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9nlh\" (UniqueName: \"kubernetes.io/projected/bb665b94-88c1-4d19-955e-421f44637d5b-kube-api-access-c9nlh\") pod \"auto-csr-approver-29552674-bqsgd\" (UID: \"bb665b94-88c1-4d19-955e-421f44637d5b\") " 
pod="openshift-infra/auto-csr-approver-29552674-bqsgd" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.486854 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" Mar 10 16:34:00 crc kubenswrapper[4743]: I0310 16:34:00.937517 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552674-bqsgd"] Mar 10 16:34:01 crc kubenswrapper[4743]: I0310 16:34:01.902337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" event={"ID":"bb665b94-88c1-4d19-955e-421f44637d5b","Type":"ContainerStarted","Data":"f2ec920ee9833b6eb1dc0594b0377e7c8df04a05d420c2eebdfcbe8591bb8795"} Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.314656 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kksgf"] Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.318214 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.334822 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kksgf"] Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.368869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-utilities\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.368934 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wthkk\" (UniqueName: \"kubernetes.io/projected/b3e0991f-df57-4a02-b753-8796c66bc80c-kube-api-access-wthkk\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.369341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-catalog-content\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.473737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-utilities\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.474185 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wthkk\" (UniqueName: \"kubernetes.io/projected/b3e0991f-df57-4a02-b753-8796c66bc80c-kube-api-access-wthkk\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.474407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-catalog-content\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.475084 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-catalog-content\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.475330 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-utilities\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.515669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wthkk\" (UniqueName: \"kubernetes.io/projected/b3e0991f-df57-4a02-b753-8796c66bc80c-kube-api-access-wthkk\") pod \"community-operators-kksgf\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.661175 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.913120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" event={"ID":"bb665b94-88c1-4d19-955e-421f44637d5b","Type":"ContainerStarted","Data":"ebf9a190df90cb2157aabc1670892b8475a4a7998dce399a2684293d33163a9d"} Mar 10 16:34:02 crc kubenswrapper[4743]: I0310 16:34:02.931733 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" podStartSLOduration=1.5524808380000001 podStartE2EDuration="2.931714288s" podCreationTimestamp="2026-03-10 16:34:00 +0000 UTC" firstStartedPulling="2026-03-10 16:34:00.937523681 +0000 UTC m=+5305.644338439" lastFinishedPulling="2026-03-10 16:34:02.316757101 +0000 UTC m=+5307.023571889" observedRunningTime="2026-03-10 16:34:02.926627164 +0000 UTC m=+5307.633441912" watchObservedRunningTime="2026-03-10 16:34:02.931714288 +0000 UTC m=+5307.638529026" Mar 10 16:34:03 crc kubenswrapper[4743]: I0310 16:34:03.232968 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kksgf"] Mar 10 16:34:03 crc kubenswrapper[4743]: W0310 16:34:03.241003 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e0991f_df57_4a02_b753_8796c66bc80c.slice/crio-d5479573c0452663b8c469509c401c0226e3e0964d51cdca1f68b8e8c1e0890c WatchSource:0}: Error finding container d5479573c0452663b8c469509c401c0226e3e0964d51cdca1f68b8e8c1e0890c: Status 404 returned error can't find the container with id d5479573c0452663b8c469509c401c0226e3e0964d51cdca1f68b8e8c1e0890c Mar 10 16:34:03 crc kubenswrapper[4743]: I0310 16:34:03.922798 4743 generic.go:334] "Generic (PLEG): container finished" podID="b3e0991f-df57-4a02-b753-8796c66bc80c" 
containerID="32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747" exitCode=0 Mar 10 16:34:03 crc kubenswrapper[4743]: I0310 16:34:03.925051 4743 generic.go:334] "Generic (PLEG): container finished" podID="bb665b94-88c1-4d19-955e-421f44637d5b" containerID="ebf9a190df90cb2157aabc1670892b8475a4a7998dce399a2684293d33163a9d" exitCode=0 Mar 10 16:34:03 crc kubenswrapper[4743]: I0310 16:34:03.937551 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksgf" event={"ID":"b3e0991f-df57-4a02-b753-8796c66bc80c","Type":"ContainerDied","Data":"32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747"} Mar 10 16:34:03 crc kubenswrapper[4743]: I0310 16:34:03.937600 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksgf" event={"ID":"b3e0991f-df57-4a02-b753-8796c66bc80c","Type":"ContainerStarted","Data":"d5479573c0452663b8c469509c401c0226e3e0964d51cdca1f68b8e8c1e0890c"} Mar 10 16:34:03 crc kubenswrapper[4743]: I0310 16:34:03.937613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" event={"ID":"bb665b94-88c1-4d19-955e-421f44637d5b","Type":"ContainerDied","Data":"ebf9a190df90cb2157aabc1670892b8475a4a7998dce399a2684293d33163a9d"} Mar 10 16:34:04 crc kubenswrapper[4743]: I0310 16:34:04.935078 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksgf" event={"ID":"b3e0991f-df57-4a02-b753-8796c66bc80c","Type":"ContainerStarted","Data":"f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3"} Mar 10 16:34:05 crc kubenswrapper[4743]: I0310 16:34:05.351443 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" Mar 10 16:34:05 crc kubenswrapper[4743]: I0310 16:34:05.438482 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9nlh\" (UniqueName: \"kubernetes.io/projected/bb665b94-88c1-4d19-955e-421f44637d5b-kube-api-access-c9nlh\") pod \"bb665b94-88c1-4d19-955e-421f44637d5b\" (UID: \"bb665b94-88c1-4d19-955e-421f44637d5b\") " Mar 10 16:34:05 crc kubenswrapper[4743]: I0310 16:34:05.449410 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb665b94-88c1-4d19-955e-421f44637d5b-kube-api-access-c9nlh" (OuterVolumeSpecName: "kube-api-access-c9nlh") pod "bb665b94-88c1-4d19-955e-421f44637d5b" (UID: "bb665b94-88c1-4d19-955e-421f44637d5b"). InnerVolumeSpecName "kube-api-access-c9nlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:34:05 crc kubenswrapper[4743]: I0310 16:34:05.541634 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9nlh\" (UniqueName: \"kubernetes.io/projected/bb665b94-88c1-4d19-955e-421f44637d5b-kube-api-access-c9nlh\") on node \"crc\" DevicePath \"\"" Mar 10 16:34:05 crc kubenswrapper[4743]: I0310 16:34:05.945764 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" Mar 10 16:34:05 crc kubenswrapper[4743]: I0310 16:34:05.946246 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552674-bqsgd" event={"ID":"bb665b94-88c1-4d19-955e-421f44637d5b","Type":"ContainerDied","Data":"f2ec920ee9833b6eb1dc0594b0377e7c8df04a05d420c2eebdfcbe8591bb8795"} Mar 10 16:34:05 crc kubenswrapper[4743]: I0310 16:34:05.946267 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2ec920ee9833b6eb1dc0594b0377e7c8df04a05d420c2eebdfcbe8591bb8795" Mar 10 16:34:06 crc kubenswrapper[4743]: I0310 16:34:06.012788 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552668-n92hv"] Mar 10 16:34:06 crc kubenswrapper[4743]: I0310 16:34:06.021387 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552668-n92hv"] Mar 10 16:34:06 crc kubenswrapper[4743]: I0310 16:34:06.958200 4743 generic.go:334] "Generic (PLEG): container finished" podID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerID="f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3" exitCode=0 Mar 10 16:34:06 crc kubenswrapper[4743]: I0310 16:34:06.958307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksgf" event={"ID":"b3e0991f-df57-4a02-b753-8796c66bc80c","Type":"ContainerDied","Data":"f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3"} Mar 10 16:34:07 crc kubenswrapper[4743]: I0310 16:34:07.926530 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403b250a-af4e-4306-9544-a821adfaf9d8" path="/var/lib/kubelet/pods/403b250a-af4e-4306-9544-a821adfaf9d8/volumes" Mar 10 16:34:07 crc kubenswrapper[4743]: I0310 16:34:07.970658 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksgf" 
event={"ID":"b3e0991f-df57-4a02-b753-8796c66bc80c","Type":"ContainerStarted","Data":"f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066"} Mar 10 16:34:10 crc kubenswrapper[4743]: I0310 16:34:10.916163 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:34:10 crc kubenswrapper[4743]: E0310 16:34:10.917286 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:34:12 crc kubenswrapper[4743]: I0310 16:34:12.662051 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:12 crc kubenswrapper[4743]: I0310 16:34:12.662381 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:12 crc kubenswrapper[4743]: I0310 16:34:12.736493 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:12 crc kubenswrapper[4743]: I0310 16:34:12.758763 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kksgf" podStartSLOduration=7.218480768 podStartE2EDuration="10.758743853s" podCreationTimestamp="2026-03-10 16:34:02 +0000 UTC" firstStartedPulling="2026-03-10 16:34:03.925409636 +0000 UTC m=+5308.632224384" lastFinishedPulling="2026-03-10 16:34:07.465672721 +0000 UTC m=+5312.172487469" observedRunningTime="2026-03-10 16:34:07.997757435 +0000 UTC m=+5312.704572183" watchObservedRunningTime="2026-03-10 
16:34:12.758743853 +0000 UTC m=+5317.465558611" Mar 10 16:34:13 crc kubenswrapper[4743]: I0310 16:34:13.076208 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:13 crc kubenswrapper[4743]: I0310 16:34:13.153172 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kksgf"] Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.034722 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kksgf" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerName="registry-server" containerID="cri-o://f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066" gracePeriod=2 Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.576860 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.684696 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-catalog-content\") pod \"b3e0991f-df57-4a02-b753-8796c66bc80c\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.684772 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wthkk\" (UniqueName: \"kubernetes.io/projected/b3e0991f-df57-4a02-b753-8796c66bc80c-kube-api-access-wthkk\") pod \"b3e0991f-df57-4a02-b753-8796c66bc80c\" (UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.684886 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-utilities\") pod \"b3e0991f-df57-4a02-b753-8796c66bc80c\" 
(UID: \"b3e0991f-df57-4a02-b753-8796c66bc80c\") " Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.685970 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-utilities" (OuterVolumeSpecName: "utilities") pod "b3e0991f-df57-4a02-b753-8796c66bc80c" (UID: "b3e0991f-df57-4a02-b753-8796c66bc80c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.709041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e0991f-df57-4a02-b753-8796c66bc80c-kube-api-access-wthkk" (OuterVolumeSpecName: "kube-api-access-wthkk") pod "b3e0991f-df57-4a02-b753-8796c66bc80c" (UID: "b3e0991f-df57-4a02-b753-8796c66bc80c"). InnerVolumeSpecName "kube-api-access-wthkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.757528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3e0991f-df57-4a02-b753-8796c66bc80c" (UID: "b3e0991f-df57-4a02-b753-8796c66bc80c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.786827 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.786867 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wthkk\" (UniqueName: \"kubernetes.io/projected/b3e0991f-df57-4a02-b753-8796c66bc80c-kube-api-access-wthkk\") on node \"crc\" DevicePath \"\"" Mar 10 16:34:15 crc kubenswrapper[4743]: I0310 16:34:15.786879 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e0991f-df57-4a02-b753-8796c66bc80c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.048082 4743 generic.go:334] "Generic (PLEG): container finished" podID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerID="f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066" exitCode=0 Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.048150 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksgf" event={"ID":"b3e0991f-df57-4a02-b753-8796c66bc80c","Type":"ContainerDied","Data":"f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066"} Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.048232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksgf" event={"ID":"b3e0991f-df57-4a02-b753-8796c66bc80c","Type":"ContainerDied","Data":"d5479573c0452663b8c469509c401c0226e3e0964d51cdca1f68b8e8c1e0890c"} Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.048256 4743 scope.go:117] "RemoveContainer" containerID="f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 
16:34:16.048173 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kksgf" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.075716 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kksgf"] Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.081206 4743 scope.go:117] "RemoveContainer" containerID="f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.085977 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kksgf"] Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.108838 4743 scope.go:117] "RemoveContainer" containerID="32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.156677 4743 scope.go:117] "RemoveContainer" containerID="f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066" Mar 10 16:34:16 crc kubenswrapper[4743]: E0310 16:34:16.157247 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066\": container with ID starting with f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066 not found: ID does not exist" containerID="f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.157284 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066"} err="failed to get container status \"f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066\": rpc error: code = NotFound desc = could not find container \"f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066\": container with ID starting with 
f6f1a0e84970d18383fb78f141d85ee82caeff485304b05e5bc7c4fbc5a80066 not found: ID does not exist" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.157305 4743 scope.go:117] "RemoveContainer" containerID="f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3" Mar 10 16:34:16 crc kubenswrapper[4743]: E0310 16:34:16.157806 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3\": container with ID starting with f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3 not found: ID does not exist" containerID="f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.157855 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3"} err="failed to get container status \"f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3\": rpc error: code = NotFound desc = could not find container \"f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3\": container with ID starting with f70eaac98b3171df5154608af29aeec8f1fb3a15982c2e683087a196bc36b6f3 not found: ID does not exist" Mar 10 16:34:16 crc kubenswrapper[4743]: I0310 16:34:16.157873 4743 scope.go:117] "RemoveContainer" containerID="32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747" Mar 10 16:34:16 crc kubenswrapper[4743]: E0310 16:34:16.158150 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747\": container with ID starting with 32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747 not found: ID does not exist" containerID="32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747" Mar 10 16:34:16 crc 
kubenswrapper[4743]: I0310 16:34:16.158173 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747"} err="failed to get container status \"32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747\": rpc error: code = NotFound desc = could not find container \"32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747\": container with ID starting with 32965821c3538bca25401693fb3d4013876bbc696045826a87e59e1ce48bb747 not found: ID does not exist" Mar 10 16:34:17 crc kubenswrapper[4743]: I0310 16:34:17.927003 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" path="/var/lib/kubelet/pods/b3e0991f-df57-4a02-b753-8796c66bc80c/volumes" Mar 10 16:34:18 crc kubenswrapper[4743]: I0310 16:34:18.601581 4743 scope.go:117] "RemoveContainer" containerID="27bb71a669b1af44d2518310d2044069aebde082f618afa0b6d839e2580b459f" Mar 10 16:34:23 crc kubenswrapper[4743]: I0310 16:34:23.915625 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:34:23 crc kubenswrapper[4743]: E0310 16:34:23.916269 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:34:37 crc kubenswrapper[4743]: I0310 16:34:37.915607 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:34:37 crc kubenswrapper[4743]: E0310 16:34:37.916432 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:34:48 crc kubenswrapper[4743]: I0310 16:34:48.916627 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:34:48 crc kubenswrapper[4743]: E0310 16:34:48.917855 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:35:01 crc kubenswrapper[4743]: I0310 16:35:01.915808 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:35:01 crc kubenswrapper[4743]: E0310 16:35:01.916593 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qrnln_openshift-machine-config-operator(1d049bbf-95c6-4135-8808-1e453cf59a07)\"" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" Mar 10 16:35:13 crc kubenswrapper[4743]: I0310 16:35:13.915841 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739" Mar 10 16:35:14 crc kubenswrapper[4743]: I0310 16:35:14.647235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"b3edd852662d8f041fcc74af7d768c0109976f178b4b6a19ba855d2bc7b3005c"} Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.157607 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552676-wpz7k"] Mar 10 16:36:00 crc kubenswrapper[4743]: E0310 16:36:00.158851 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb665b94-88c1-4d19-955e-421f44637d5b" containerName="oc" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.158873 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb665b94-88c1-4d19-955e-421f44637d5b" containerName="oc" Mar 10 16:36:00 crc kubenswrapper[4743]: E0310 16:36:00.158893 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerName="extract-content" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.158905 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerName="extract-content" Mar 10 16:36:00 crc kubenswrapper[4743]: E0310 16:36:00.158936 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerName="registry-server" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.158946 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerName="registry-server" Mar 10 16:36:00 crc kubenswrapper[4743]: E0310 16:36:00.158987 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerName="extract-utilities" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.158998 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerName="extract-utilities" Mar 10 16:36:00 crc kubenswrapper[4743]: 
I0310 16:36:00.159314 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb665b94-88c1-4d19-955e-421f44637d5b" containerName="oc" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.159357 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e0991f-df57-4a02-b753-8796c66bc80c" containerName="registry-server" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.160355 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552676-wpz7k" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.165258 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.165751 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.172876 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552676-wpz7k"] Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.174499 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.247508 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2zg\" (UniqueName: \"kubernetes.io/projected/27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc-kube-api-access-sj2zg\") pod \"auto-csr-approver-29552676-wpz7k\" (UID: \"27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc\") " pod="openshift-infra/auto-csr-approver-29552676-wpz7k" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.349015 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj2zg\" (UniqueName: \"kubernetes.io/projected/27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc-kube-api-access-sj2zg\") pod \"auto-csr-approver-29552676-wpz7k\" 
(UID: \"27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc\") " pod="openshift-infra/auto-csr-approver-29552676-wpz7k" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.378544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2zg\" (UniqueName: \"kubernetes.io/projected/27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc-kube-api-access-sj2zg\") pod \"auto-csr-approver-29552676-wpz7k\" (UID: \"27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc\") " pod="openshift-infra/auto-csr-approver-29552676-wpz7k" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.492362 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552676-wpz7k" Mar 10 16:36:00 crc kubenswrapper[4743]: I0310 16:36:00.958731 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552676-wpz7k"] Mar 10 16:36:01 crc kubenswrapper[4743]: I0310 16:36:01.104697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552676-wpz7k" event={"ID":"27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc","Type":"ContainerStarted","Data":"314557739daf8f489e725a46fa1d3d8eeb582ad8c55990f6ad51ce69cdc0b907"} Mar 10 16:36:06 crc kubenswrapper[4743]: I0310 16:36:06.155962 4743 generic.go:334] "Generic (PLEG): container finished" podID="27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc" containerID="7ac6a215abe657c59bc75ad0c5f5433816eb3f58e627a6ef7653cb2cf14318fc" exitCode=0 Mar 10 16:36:06 crc kubenswrapper[4743]: I0310 16:36:06.156050 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552676-wpz7k" event={"ID":"27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc","Type":"ContainerDied","Data":"7ac6a215abe657c59bc75ad0c5f5433816eb3f58e627a6ef7653cb2cf14318fc"} Mar 10 16:36:07 crc kubenswrapper[4743]: I0310 16:36:07.518293 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552676-wpz7k" Mar 10 16:36:07 crc kubenswrapper[4743]: I0310 16:36:07.637718 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj2zg\" (UniqueName: \"kubernetes.io/projected/27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc-kube-api-access-sj2zg\") pod \"27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc\" (UID: \"27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc\") " Mar 10 16:36:07 crc kubenswrapper[4743]: I0310 16:36:07.644906 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc-kube-api-access-sj2zg" (OuterVolumeSpecName: "kube-api-access-sj2zg") pod "27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc" (UID: "27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc"). InnerVolumeSpecName "kube-api-access-sj2zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:36:07 crc kubenswrapper[4743]: I0310 16:36:07.740308 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj2zg\" (UniqueName: \"kubernetes.io/projected/27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc-kube-api-access-sj2zg\") on node \"crc\" DevicePath \"\"" Mar 10 16:36:08 crc kubenswrapper[4743]: I0310 16:36:08.175941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552676-wpz7k" event={"ID":"27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc","Type":"ContainerDied","Data":"314557739daf8f489e725a46fa1d3d8eeb582ad8c55990f6ad51ce69cdc0b907"} Mar 10 16:36:08 crc kubenswrapper[4743]: I0310 16:36:08.176223 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="314557739daf8f489e725a46fa1d3d8eeb582ad8c55990f6ad51ce69cdc0b907" Mar 10 16:36:08 crc kubenswrapper[4743]: I0310 16:36:08.175974 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552676-wpz7k" Mar 10 16:36:08 crc kubenswrapper[4743]: I0310 16:36:08.602468 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552670-wpqm9"] Mar 10 16:36:08 crc kubenswrapper[4743]: I0310 16:36:08.613691 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552670-wpqm9"] Mar 10 16:36:09 crc kubenswrapper[4743]: I0310 16:36:09.930048 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9233029-7319-4fed-b056-0bbeb8831ac5" path="/var/lib/kubelet/pods/d9233029-7319-4fed-b056-0bbeb8831ac5/volumes" Mar 10 16:36:18 crc kubenswrapper[4743]: I0310 16:36:18.859044 4743 scope.go:117] "RemoveContainer" containerID="19d722dcdf37551f6498072e6554fa1cbf678e3c92ba17ece31be8d43bfed3c2" Mar 10 16:37:41 crc kubenswrapper[4743]: I0310 16:37:41.252801 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:37:41 crc kubenswrapper[4743]: I0310 16:37:41.253483 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.155373 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552678-sz6c4"] Mar 10 16:38:00 crc kubenswrapper[4743]: E0310 16:38:00.156321 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc" containerName="oc" Mar 10 16:38:00 crc 
kubenswrapper[4743]: I0310 16:38:00.156335 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc" containerName="oc" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.156545 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c6cfca-4aef-4ea4-b6c8-f8c55b8ee1cc" containerName="oc" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.157206 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552678-sz6c4" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.162556 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.162570 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-hqm5p" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.162647 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.165651 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552678-sz6c4"] Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.236224 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nswq8\" (UniqueName: \"kubernetes.io/projected/b16eee00-ff6d-4750-b09a-d11a4ca322b7-kube-api-access-nswq8\") pod \"auto-csr-approver-29552678-sz6c4\" (UID: \"b16eee00-ff6d-4750-b09a-d11a4ca322b7\") " pod="openshift-infra/auto-csr-approver-29552678-sz6c4" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.337918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nswq8\" (UniqueName: \"kubernetes.io/projected/b16eee00-ff6d-4750-b09a-d11a4ca322b7-kube-api-access-nswq8\") pod \"auto-csr-approver-29552678-sz6c4\" 
(UID: \"b16eee00-ff6d-4750-b09a-d11a4ca322b7\") " pod="openshift-infra/auto-csr-approver-29552678-sz6c4" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.366712 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nswq8\" (UniqueName: \"kubernetes.io/projected/b16eee00-ff6d-4750-b09a-d11a4ca322b7-kube-api-access-nswq8\") pod \"auto-csr-approver-29552678-sz6c4\" (UID: \"b16eee00-ff6d-4750-b09a-d11a4ca322b7\") " pod="openshift-infra/auto-csr-approver-29552678-sz6c4" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.480086 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552678-sz6c4" Mar 10 16:38:00 crc kubenswrapper[4743]: I0310 16:38:00.941997 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552678-sz6c4"] Mar 10 16:38:01 crc kubenswrapper[4743]: I0310 16:38:01.412651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552678-sz6c4" event={"ID":"b16eee00-ff6d-4750-b09a-d11a4ca322b7","Type":"ContainerStarted","Data":"2651f3af21392b6df550037329a4b72524edda6befac7b31b463bac57c9f12e0"} Mar 10 16:38:03 crc kubenswrapper[4743]: I0310 16:38:03.433678 4743 generic.go:334] "Generic (PLEG): container finished" podID="b16eee00-ff6d-4750-b09a-d11a4ca322b7" containerID="c1ca916cca1521a7d2be125aed511492377f9a1bed80884ea1c42ca2c1808a76" exitCode=0 Mar 10 16:38:03 crc kubenswrapper[4743]: I0310 16:38:03.433804 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552678-sz6c4" event={"ID":"b16eee00-ff6d-4750-b09a-d11a4ca322b7","Type":"ContainerDied","Data":"c1ca916cca1521a7d2be125aed511492377f9a1bed80884ea1c42ca2c1808a76"} Mar 10 16:38:04 crc kubenswrapper[4743]: I0310 16:38:04.772905 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552678-sz6c4" Mar 10 16:38:04 crc kubenswrapper[4743]: I0310 16:38:04.832202 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nswq8\" (UniqueName: \"kubernetes.io/projected/b16eee00-ff6d-4750-b09a-d11a4ca322b7-kube-api-access-nswq8\") pod \"b16eee00-ff6d-4750-b09a-d11a4ca322b7\" (UID: \"b16eee00-ff6d-4750-b09a-d11a4ca322b7\") " Mar 10 16:38:04 crc kubenswrapper[4743]: I0310 16:38:04.839063 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b16eee00-ff6d-4750-b09a-d11a4ca322b7-kube-api-access-nswq8" (OuterVolumeSpecName: "kube-api-access-nswq8") pod "b16eee00-ff6d-4750-b09a-d11a4ca322b7" (UID: "b16eee00-ff6d-4750-b09a-d11a4ca322b7"). InnerVolumeSpecName "kube-api-access-nswq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:38:04 crc kubenswrapper[4743]: I0310 16:38:04.935109 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nswq8\" (UniqueName: \"kubernetes.io/projected/b16eee00-ff6d-4750-b09a-d11a4ca322b7-kube-api-access-nswq8\") on node \"crc\" DevicePath \"\"" Mar 10 16:38:05 crc kubenswrapper[4743]: I0310 16:38:05.455665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552678-sz6c4" event={"ID":"b16eee00-ff6d-4750-b09a-d11a4ca322b7","Type":"ContainerDied","Data":"2651f3af21392b6df550037329a4b72524edda6befac7b31b463bac57c9f12e0"} Mar 10 16:38:05 crc kubenswrapper[4743]: I0310 16:38:05.455709 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2651f3af21392b6df550037329a4b72524edda6befac7b31b463bac57c9f12e0" Mar 10 16:38:05 crc kubenswrapper[4743]: I0310 16:38:05.455769 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552678-sz6c4" Mar 10 16:38:05 crc kubenswrapper[4743]: I0310 16:38:05.854802 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552672-zhxpv"] Mar 10 16:38:05 crc kubenswrapper[4743]: I0310 16:38:05.866022 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552672-zhxpv"] Mar 10 16:38:05 crc kubenswrapper[4743]: I0310 16:38:05.927766 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c534b8a9-c60b-4307-967d-e1fe25a4a451" path="/var/lib/kubelet/pods/c534b8a9-c60b-4307-967d-e1fe25a4a451/volumes" Mar 10 16:38:11 crc kubenswrapper[4743]: I0310 16:38:11.252862 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:38:11 crc kubenswrapper[4743]: I0310 16:38:11.253591 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:38:18 crc kubenswrapper[4743]: I0310 16:38:18.969628 4743 scope.go:117] "RemoveContainer" containerID="50a59119eee51b1af8077904069a78ac64ebdf1b610a58cfab7e9807bc73adba" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.827838 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t9c9s"] Mar 10 16:38:22 crc kubenswrapper[4743]: E0310 16:38:22.829088 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16eee00-ff6d-4750-b09a-d11a4ca322b7" containerName="oc" Mar 10 16:38:22 crc 
kubenswrapper[4743]: I0310 16:38:22.829107 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16eee00-ff6d-4750-b09a-d11a4ca322b7" containerName="oc" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.829424 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16eee00-ff6d-4750-b09a-d11a4ca322b7" containerName="oc" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.831116 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.840359 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9c9s"] Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.868304 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-utilities\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.868391 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-catalog-content\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.868497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j876p\" (UniqueName: \"kubernetes.io/projected/4e56b904-f9eb-447d-979b-5b5e18d0fec3-kube-api-access-j876p\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: 
I0310 16:38:22.971297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-utilities\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.972185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-catalog-content\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.972401 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j876p\" (UniqueName: \"kubernetes.io/projected/4e56b904-f9eb-447d-979b-5b5e18d0fec3-kube-api-access-j876p\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.972059 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-utilities\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.973570 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-catalog-content\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:22 crc kubenswrapper[4743]: I0310 16:38:22.998346 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j876p\" (UniqueName: \"kubernetes.io/projected/4e56b904-f9eb-447d-979b-5b5e18d0fec3-kube-api-access-j876p\") pod \"redhat-marketplace-t9c9s\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:23 crc kubenswrapper[4743]: I0310 16:38:23.161146 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:23 crc kubenswrapper[4743]: I0310 16:38:23.632603 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9c9s"] Mar 10 16:38:24 crc kubenswrapper[4743]: I0310 16:38:24.663356 4743 generic.go:334] "Generic (PLEG): container finished" podID="4e56b904-f9eb-447d-979b-5b5e18d0fec3" containerID="784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8" exitCode=0 Mar 10 16:38:24 crc kubenswrapper[4743]: I0310 16:38:24.663593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9c9s" event={"ID":"4e56b904-f9eb-447d-979b-5b5e18d0fec3","Type":"ContainerDied","Data":"784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8"} Mar 10 16:38:24 crc kubenswrapper[4743]: I0310 16:38:24.663723 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9c9s" event={"ID":"4e56b904-f9eb-447d-979b-5b5e18d0fec3","Type":"ContainerStarted","Data":"644594dd4c2694b544313259138667d20e07b519977985e9c5a80308ad376097"} Mar 10 16:38:25 crc kubenswrapper[4743]: I0310 16:38:25.677233 4743 generic.go:334] "Generic (PLEG): container finished" podID="4e56b904-f9eb-447d-979b-5b5e18d0fec3" containerID="5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0" exitCode=0 Mar 10 16:38:25 crc kubenswrapper[4743]: I0310 16:38:25.677356 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-t9c9s" event={"ID":"4e56b904-f9eb-447d-979b-5b5e18d0fec3","Type":"ContainerDied","Data":"5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0"} Mar 10 16:38:26 crc kubenswrapper[4743]: I0310 16:38:26.699123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9c9s" event={"ID":"4e56b904-f9eb-447d-979b-5b5e18d0fec3","Type":"ContainerStarted","Data":"dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938"} Mar 10 16:38:26 crc kubenswrapper[4743]: I0310 16:38:26.745743 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t9c9s" podStartSLOduration=3.332901658 podStartE2EDuration="4.745722813s" podCreationTimestamp="2026-03-10 16:38:22 +0000 UTC" firstStartedPulling="2026-03-10 16:38:24.665564484 +0000 UTC m=+5569.372379232" lastFinishedPulling="2026-03-10 16:38:26.078385629 +0000 UTC m=+5570.785200387" observedRunningTime="2026-03-10 16:38:26.743775488 +0000 UTC m=+5571.450590236" watchObservedRunningTime="2026-03-10 16:38:26.745722813 +0000 UTC m=+5571.452537561" Mar 10 16:38:33 crc kubenswrapper[4743]: I0310 16:38:33.162546 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:33 crc kubenswrapper[4743]: I0310 16:38:33.162999 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:33 crc kubenswrapper[4743]: I0310 16:38:33.220163 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:33 crc kubenswrapper[4743]: I0310 16:38:33.841545 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:33 crc kubenswrapper[4743]: I0310 16:38:33.939622 4743 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9c9s"] Mar 10 16:38:35 crc kubenswrapper[4743]: I0310 16:38:35.776339 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t9c9s" podUID="4e56b904-f9eb-447d-979b-5b5e18d0fec3" containerName="registry-server" containerID="cri-o://dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938" gracePeriod=2 Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.353500 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.449204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-utilities\") pod \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.449426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-catalog-content\") pod \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.449655 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j876p\" (UniqueName: \"kubernetes.io/projected/4e56b904-f9eb-447d-979b-5b5e18d0fec3-kube-api-access-j876p\") pod \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\" (UID: \"4e56b904-f9eb-447d-979b-5b5e18d0fec3\") " Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.450022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-utilities" (OuterVolumeSpecName: "utilities") pod 
"4e56b904-f9eb-447d-979b-5b5e18d0fec3" (UID: "4e56b904-f9eb-447d-979b-5b5e18d0fec3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.450470 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.463219 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e56b904-f9eb-447d-979b-5b5e18d0fec3-kube-api-access-j876p" (OuterVolumeSpecName: "kube-api-access-j876p") pod "4e56b904-f9eb-447d-979b-5b5e18d0fec3" (UID: "4e56b904-f9eb-447d-979b-5b5e18d0fec3"). InnerVolumeSpecName "kube-api-access-j876p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.482350 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e56b904-f9eb-447d-979b-5b5e18d0fec3" (UID: "4e56b904-f9eb-447d-979b-5b5e18d0fec3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.551602 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e56b904-f9eb-447d-979b-5b5e18d0fec3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.551637 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j876p\" (UniqueName: \"kubernetes.io/projected/4e56b904-f9eb-447d-979b-5b5e18d0fec3-kube-api-access-j876p\") on node \"crc\" DevicePath \"\"" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.788612 4743 generic.go:334] "Generic (PLEG): container finished" podID="4e56b904-f9eb-447d-979b-5b5e18d0fec3" containerID="dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938" exitCode=0 Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.788676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9c9s" event={"ID":"4e56b904-f9eb-447d-979b-5b5e18d0fec3","Type":"ContainerDied","Data":"dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938"} Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.788718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9c9s" event={"ID":"4e56b904-f9eb-447d-979b-5b5e18d0fec3","Type":"ContainerDied","Data":"644594dd4c2694b544313259138667d20e07b519977985e9c5a80308ad376097"} Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.788731 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9c9s" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.788749 4743 scope.go:117] "RemoveContainer" containerID="dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.839132 4743 scope.go:117] "RemoveContainer" containerID="5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.850120 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9c9s"] Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.860947 4743 scope.go:117] "RemoveContainer" containerID="784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.879434 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9c9s"] Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.930991 4743 scope.go:117] "RemoveContainer" containerID="dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938" Mar 10 16:38:36 crc kubenswrapper[4743]: E0310 16:38:36.931556 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938\": container with ID starting with dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938 not found: ID does not exist" containerID="dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.931618 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938"} err="failed to get container status \"dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938\": rpc error: code = NotFound desc = could not find container 
\"dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938\": container with ID starting with dcab4026f682f7a710dcc58780e1f79ba44462582c66c0090cb06b23742db938 not found: ID does not exist" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.931651 4743 scope.go:117] "RemoveContainer" containerID="5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0" Mar 10 16:38:36 crc kubenswrapper[4743]: E0310 16:38:36.932029 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0\": container with ID starting with 5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0 not found: ID does not exist" containerID="5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.932103 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0"} err="failed to get container status \"5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0\": rpc error: code = NotFound desc = could not find container \"5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0\": container with ID starting with 5eb90d1a3ad787cab443e4564a8264100b87ee45b8b94f6cb85b8075eb0034f0 not found: ID does not exist" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.932136 4743 scope.go:117] "RemoveContainer" containerID="784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8" Mar 10 16:38:36 crc kubenswrapper[4743]: E0310 16:38:36.932434 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8\": container with ID starting with 784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8 not found: ID does not exist" 
containerID="784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8" Mar 10 16:38:36 crc kubenswrapper[4743]: I0310 16:38:36.932461 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8"} err="failed to get container status \"784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8\": rpc error: code = NotFound desc = could not find container \"784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8\": container with ID starting with 784cc67c6632385f762c8eb4f5fa10e177d647558c4e2d7cf7cdc61eed980fb8 not found: ID does not exist" Mar 10 16:38:37 crc kubenswrapper[4743]: I0310 16:38:37.926352 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e56b904-f9eb-447d-979b-5b5e18d0fec3" path="/var/lib/kubelet/pods/4e56b904-f9eb-447d-979b-5b5e18d0fec3/volumes" Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.253034 4743 patch_prober.go:28] interesting pod/machine-config-daemon-qrnln container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.253627 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.253688 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.254694 4743 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3edd852662d8f041fcc74af7d768c0109976f178b4b6a19ba855d2bc7b3005c"} pod="openshift-machine-config-operator/machine-config-daemon-qrnln" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.254779 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" podUID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerName="machine-config-daemon" containerID="cri-o://b3edd852662d8f041fcc74af7d768c0109976f178b4b6a19ba855d2bc7b3005c" gracePeriod=600 Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.836921 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d049bbf-95c6-4135-8808-1e453cf59a07" containerID="b3edd852662d8f041fcc74af7d768c0109976f178b4b6a19ba855d2bc7b3005c" exitCode=0 Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.836954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerDied","Data":"b3edd852662d8f041fcc74af7d768c0109976f178b4b6a19ba855d2bc7b3005c"} Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.837633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qrnln" event={"ID":"1d049bbf-95c6-4135-8808-1e453cf59a07","Type":"ContainerStarted","Data":"7eb0273e25933b35f473c4e994095484f65adef54ef567a3cf6065fe5623fd52"} Mar 10 16:38:41 crc kubenswrapper[4743]: I0310 16:38:41.837656 4743 scope.go:117] "RemoveContainer" containerID="4ab69fb79c800cba95fd9c6639120478afbc78f9bc3e8388fa1a66777da66739"